[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
11000 1726867137.67695: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-Isn
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
11000 1726867137.68631: Added group all to inventory
11000 1726867137.68633: Added group ungrouped to inventory
11000 1726867137.68637: Group all now contains ungrouped
11000 1726867137.68640: Examining possible inventory source: /tmp/network-5rw/inventory.yml
11000 1726867138.00872: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
11000 1726867138.01026: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
11000 1726867138.01050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
11000 1726867138.01152: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
11000 1726867138.01343: Loaded config def from plugin (inventory/script)
11000 1726867138.01345: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
11000 1726867138.01387: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
11000 1726867138.01597: Loaded config def from plugin (inventory/yaml)
11000 1726867138.01599: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
11000 1726867138.01801: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
11000 1726867138.02733: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
11000 1726867138.02736: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
11000 1726867138.02739: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
11000 1726867138.02745: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
11000 1726867138.02864: Loading data from /tmp/network-5rw/inventory.yml
11000 1726867138.02936: /tmp/network-5rw/inventory.yml was not parsable by auto
11000 1726867138.03119: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
11000 1726867138.03157: Loading data from /tmp/network-5rw/inventory.yml
11000 1726867138.03348: group all already in inventory
11000 1726867138.03355: set inventory_file for managed_node1
11000 1726867138.03359: set inventory_dir for managed_node1
11000 1726867138.03360: Added host managed_node1 to inventory
11000 1726867138.03362: Added host managed_node1 to group all
11000 1726867138.03363: set ansible_host for managed_node1
11000 1726867138.03364:
set ansible_ssh_extra_args for managed_node1 11000 1726867138.03367: set inventory_file for managed_node2 11000 1726867138.03369: set inventory_dir for managed_node2 11000 1726867138.03370: Added host managed_node2 to inventory 11000 1726867138.03372: Added host managed_node2 to group all 11000 1726867138.03372: set ansible_host for managed_node2 11000 1726867138.03373: set ansible_ssh_extra_args for managed_node2 11000 1726867138.03376: set inventory_file for managed_node3 11000 1726867138.03380: set inventory_dir for managed_node3 11000 1726867138.03381: Added host managed_node3 to inventory 11000 1726867138.03382: Added host managed_node3 to group all 11000 1726867138.03383: set ansible_host for managed_node3 11000 1726867138.03384: set ansible_ssh_extra_args for managed_node3 11000 1726867138.03387: Reconcile groups and hosts in inventory. 11000 1726867138.03390: Group ungrouped now contains managed_node1 11000 1726867138.03392: Group ungrouped now contains managed_node2 11000 1726867138.03394: Group ungrouped now contains managed_node3 11000 1726867138.03635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 11000 1726867138.03868: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 11000 1726867138.03918: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 11000 1726867138.03946: Loaded config def from plugin (vars/host_group_vars) 11000 1726867138.03948: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 11000 1726867138.04069: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 11000 1726867138.04078: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 11000 1726867138.04121: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 11000 1726867138.04862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867138.04960: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 11000 1726867138.05116: Loaded config def from plugin (connection/local) 11000 1726867138.05120: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 11000 1726867138.06499: Loaded config def from plugin (connection/paramiko_ssh) 11000 1726867138.06507: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 11000 1726867138.08291: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11000 1726867138.08330: Loaded config def from plugin (connection/psrp) 11000 1726867138.08333: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 11000 1726867138.09950: Loading ModuleDocFragment 'connection_pipelining' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11000 1726867138.10095: Loaded config def from plugin (connection/ssh) 11000 1726867138.10098: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 11000 1726867138.12973: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 11000 1726867138.13015: Loaded config def from plugin (connection/winrm) 11000 1726867138.13018: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 11000 1726867138.13053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 11000 1726867138.13115: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 11000 1726867138.13190: Loaded config def from plugin (shell/cmd) 11000 1726867138.13192: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 11000 1726867138.13217: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 11000 1726867138.13289: Loaded config def from plugin (shell/powershell) 11000 1726867138.13291: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 11000 1726867138.13342: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 11000 1726867138.13529: Loaded config def from plugin (shell/sh) 11000 1726867138.13531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 11000 1726867138.13564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 11000 1726867138.13692: Loaded config def from plugin (become/runas) 11000 1726867138.13694: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 11000 1726867138.13884: Loaded config def from plugin (become/su) 11000 1726867138.13886: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 11000 1726867138.14115: Loaded config def from plugin (become/sudo) 11000 1726867138.14117: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 11000 1726867138.14408: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml 11000 1726867138.15035: in VariableManager get_vars() 11000 1726867138.15057: done with get_vars() 11000 1726867138.15380: trying /usr/local/lib/python3.12/site-packages/ansible/modules 11000 1726867138.19946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 11000 1726867138.20065: in VariableManager 
get_vars() 11000 1726867138.20070: done with get_vars() 11000 1726867138.20073: variable 'playbook_dir' from source: magic vars 11000 1726867138.20080: variable 'ansible_playbook_python' from source: magic vars 11000 1726867138.20081: variable 'ansible_config_file' from source: magic vars 11000 1726867138.20082: variable 'groups' from source: magic vars 11000 1726867138.20082: variable 'omit' from source: magic vars 11000 1726867138.20083: variable 'ansible_version' from source: magic vars 11000 1726867138.20084: variable 'ansible_check_mode' from source: magic vars 11000 1726867138.20085: variable 'ansible_diff_mode' from source: magic vars 11000 1726867138.20085: variable 'ansible_forks' from source: magic vars 11000 1726867138.20086: variable 'ansible_inventory_sources' from source: magic vars 11000 1726867138.20087: variable 'ansible_skip_tags' from source: magic vars 11000 1726867138.20088: variable 'ansible_limit' from source: magic vars 11000 1726867138.20088: variable 'ansible_run_tags' from source: magic vars 11000 1726867138.20089: variable 'ansible_verbosity' from source: magic vars 11000 1726867138.20122: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml 11000 1726867138.21558: in VariableManager get_vars() 11000 1726867138.21573: done with get_vars() 11000 1726867138.21583: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 11000 1726867138.22815: in VariableManager get_vars() 11000 1726867138.22830: done with get_vars() 11000 1726867138.22839: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11000 1726867138.22946: in VariableManager get_vars() 11000 1726867138.22972: done with get_vars() 11000 1726867138.23110: in VariableManager get_vars() 11000 1726867138.23124: done with get_vars() 11000 1726867138.23132: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11000 1726867138.23202: in VariableManager get_vars() 11000 1726867138.23222: done with get_vars() 11000 1726867138.23506: in VariableManager get_vars() 11000 1726867138.23518: done with get_vars() 11000 1726867138.23523: variable 'omit' from source: magic vars 11000 1726867138.23544: variable 'omit' from source: magic vars 11000 1726867138.23579: in VariableManager get_vars() 11000 1726867138.23591: done with get_vars() 11000 1726867138.23635: in VariableManager get_vars() 11000 1726867138.23651: done with get_vars() 11000 1726867138.23686: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11000 1726867138.23902: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11000 1726867138.24031: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 
11000 1726867138.24661: in VariableManager get_vars() 11000 1726867138.24679: done with get_vars() 11000 1726867138.25090: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 11000 1726867138.25243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11000 1726867138.27506: in VariableManager get_vars() 11000 1726867138.27527: done with get_vars() 11000 1726867138.27537: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 11000 1726867138.27725: in VariableManager get_vars() 11000 1726867138.27745: done with get_vars() 11000 1726867138.27867: in VariableManager get_vars() 11000 1726867138.27885: done with get_vars() 11000 1726867138.28283: in VariableManager get_vars() 11000 1726867138.28301: done with get_vars() 11000 1726867138.28306: variable 'omit' from source: magic vars 11000 1726867138.28339: variable 'omit' from source: magic vars 11000 1726867138.28380: in VariableManager get_vars() 11000 1726867138.28395: done with get_vars() 11000 1726867138.28416: in VariableManager get_vars() 11000 1726867138.28440: done with get_vars() 11000 1726867138.28470: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 11000 1726867138.28592: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 11000 1726867138.30430: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 11000 1726867138.30847: in VariableManager get_vars() 11000 1726867138.30867: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11000 1726867138.32878: in VariableManager get_vars() 11000 1726867138.32900: done with get_vars() 11000 1726867138.32908: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 11000 1726867138.33407: in VariableManager get_vars() 11000 1726867138.33434: done with get_vars() 11000 1726867138.33492: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 11000 1726867138.33506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 11000 1726867138.33755: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 11000 1726867138.33933: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 11000 1726867138.33936: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 11000 1726867138.33976: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 11000 1726867138.34003: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 11000 1726867138.34185: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 11000 1726867138.34246: Loaded config def from plugin (callback/default) 11000 1726867138.34248: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11000 1726867138.35436: Loaded config def from plugin (callback/junit) 11000 1726867138.35439: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11000 1726867138.35483: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 11000 1726867138.35550: Loaded config def from plugin (callback/minimal) 11000 1726867138.35552: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11000 1726867138.35589: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 11000 1726867138.35652: Loaded config def from plugin (callback/tree) 11000 1726867138.35654: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 11000 1726867138.35773: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 11000 1726867138.35776: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-Isn/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
PLAYBOOK: tests_bond_deprecated_nm.yml *****************************************
2 plays in /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml
11000 1726867138.35805: in VariableManager get_vars()
11000 1726867138.35824: done with get_vars()
11000 1726867138.35830: in VariableManager get_vars()
11000 1726867138.35839: done with get_vars()
11000 1726867138.35843: variable 'omit' from source: magic vars
11000 1726867138.35880: in VariableManager get_vars()
11000 1726867138.35894: done with get_vars()
11000 1726867138.35911: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond_deprecated.yml' with nm as provider] ***
11000 1726867138.36444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
11000 1726867138.36523: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
11000 1726867138.36553: getting the remaining hosts for this loop
11000 1726867138.36555: done getting the remaining hosts for this loop
11000 1726867138.36557: getting the next task for host managed_node1
11000 1726867138.36561: done getting next task for host managed_node1
11000 1726867138.36563: ^ task is: TASK: Gathering Facts
11000 1726867138.36564: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11000 1726867138.36567: getting variables
11000 1726867138.36568: in VariableManager get_vars()
11000 1726867138.36585: Calling all_inventory to load vars for managed_node1
11000 1726867138.36588: Calling groups_inventory to load vars for managed_node1
11000 1726867138.36591: Calling all_plugins_inventory to load vars for managed_node1
11000 1726867138.36602: Calling all_plugins_play to load vars for managed_node1
11000 1726867138.36613: Calling groups_plugins_inventory to load vars for managed_node1
11000 1726867138.36617: Calling groups_plugins_play to load vars for managed_node1
11000 1726867138.36649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11000 1726867138.36710: done with get_vars()
11000 1726867138.36716: done getting variables
11000 1726867138.36775: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:6
Friday 20 September 2024  17:18:58 -0400 (0:00:00.011)       0:00:00.011 ******
11000 1726867138.36805: entering _queue_task() for managed_node1/gather_facts
11000 1726867138.36806: Creating lock for gather_facts
11000 1726867138.37220: worker is 1 (out of 1 available)
11000 1726867138.37230: exiting _queue_task() for managed_node1/gather_facts
11000 1726867138.37357: done queuing things up, now waiting for results queue to drain
11000 1726867138.37359: waiting for pending results...
11000 1726867138.37479: running TaskExecutor() for managed_node1/TASK: Gathering Facts 11000 1726867138.37566: in run() - task 0affcac9-a3a5-c734-026a-0000000000cd 11000 1726867138.37576: variable 'ansible_search_path' from source: unknown 11000 1726867138.37620: calling self._execute() 11000 1726867138.37695: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867138.37707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867138.37718: variable 'omit' from source: magic vars 11000 1726867138.37826: variable 'omit' from source: magic vars 11000 1726867138.37856: variable 'omit' from source: magic vars 11000 1726867138.37983: variable 'omit' from source: magic vars 11000 1726867138.37986: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867138.38040: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867138.38064: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867138.38088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867138.38104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867138.38145: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867138.38154: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867138.38162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867138.38268: Set connection var ansible_shell_type to sh 11000 1726867138.38289: Set connection var ansible_pipelining to False 11000 1726867138.38302: Set connection var ansible_shell_executable to /bin/sh 11000 1726867138.38309: Set connection var ansible_connection to ssh 11000 1726867138.38319: Set connection var ansible_timeout to 10 11000 1726867138.38328: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867138.38364: variable 'ansible_shell_executable' from source: unknown 11000 1726867138.38372: variable 'ansible_connection' from source: unknown 11000 1726867138.38449: variable 'ansible_module_compression' from source: unknown 11000 1726867138.38453: variable 'ansible_shell_type' from source: unknown 11000 1726867138.38455: variable 'ansible_shell_executable' from source: unknown 11000 1726867138.38457: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867138.38459: variable 'ansible_pipelining' from source: unknown 11000 1726867138.38461: variable 'ansible_timeout' from source: unknown 11000 1726867138.38464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867138.38592: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867138.38605: variable 'omit' from source: magic vars 11000 1726867138.38612: starting attempt loop 11000 1726867138.38617: running the handler 11000 1726867138.38634: variable 'ansible_facts' from source: unknown 11000 1726867138.38656: _low_level_execute_command(): starting 11000 1726867138.38680: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867138.39486: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867138.39505: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867138.39547: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867138.39618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867138.39646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867138.39861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867138.41489: stdout chunk (state=3): >>>/root <<< 11000 1726867138.41583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867138.41613: stderr chunk (state=3): >>><<< 11000 1726867138.41622: stdout chunk (state=3): >>><<< 11000 1726867138.41652: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867138.41740: _low_level_execute_command(): starting 11000 1726867138.41746: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867138.4165814-11025-92159888715437 `" && echo ansible-tmp-1726867138.4165814-11025-92159888715437="` echo /root/.ansible/tmp/ansible-tmp-1726867138.4165814-11025-92159888715437 `" ) && sleep 0' 11000 1726867138.42394: stderr chunk (state=2): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867138.42442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867138.42459: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867138.42480: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867138.42570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867138.44455: stdout chunk (state=3): >>>ansible-tmp-1726867138.4165814-11025-92159888715437=/root/.ansible/tmp/ansible-tmp-1726867138.4165814-11025-92159888715437 <<< 11000 1726867138.44644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867138.44665: stdout chunk (state=3): >>><<< 11000 1726867138.44668: stderr chunk (state=3): >>><<< 11000 1726867138.44690: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867138.4165814-11025-92159888715437=/root/.ansible/tmp/ansible-tmp-1726867138.4165814-11025-92159888715437 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867138.44883: variable 'ansible_module_compression' from source: unknown 11000 1726867138.44888: ANSIBALLZ: Using generic lock for ansible.legacy.setup 11000 1726867138.44891: ANSIBALLZ: Acquiring lock 11000 1726867138.44893: ANSIBALLZ: Lock acquired: 139984830862384 11000 1726867138.44895: ANSIBALLZ: Creating module 11000 1726867138.78418: ANSIBALLZ: Writing module into payload 11000 1726867138.78683: ANSIBALLZ: Writing module 11000 1726867138.78806: 
ANSIBALLZ: Renaming module 11000 1726867138.78825: ANSIBALLZ: Done creating module 11000 1726867138.78867: variable 'ansible_facts' from source: unknown 11000 1726867138.79040: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867138.79043: _low_level_execute_command(): starting 11000 1726867138.79046: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 11000 1726867138.79770: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867138.79788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867138.79813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867138.79831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867138.79927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867138.79941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867138.79956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867138.80036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867138.80156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867138.81925: stdout chunk (state=3): >>>PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 <<< 11000 1726867138.81995: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 11000 1726867138.82090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867138.82127: stdout chunk (state=3): >>><<< 11000 1726867138.82149: stderr chunk (state=3): >>><<< 11000 1726867138.82341: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867138.82349 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 11000 1726867138.82353: _low_level_execute_command(): starting 11000 1726867138.82355: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 11000 1726867138.82788: Sending initial data 11000 1726867138.82792: Sent initial data (1181 bytes) 11000 1726867138.83726: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867138.83784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867138.83802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867138.83861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867138.83992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867138.84030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867138.84066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867138.84100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867138.84269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867138.87584: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 11000 1726867138.88282: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867138.88285: stdout chunk (state=3): >>><<< 11000 1726867138.88287: stderr chunk (state=3): >>><<< 11000 1726867138.88290: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 
(Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867138.88292: variable 'ansible_facts' from source: unknown 11000 1726867138.88294: variable 'ansible_facts' from source: unknown 11000 1726867138.88296: variable 'ansible_module_compression' from source: unknown 11000 1726867138.88682: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11000 1726867138.88685: variable 'ansible_facts' from source: unknown 11000 1726867138.88911: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867138.4165814-11025-92159888715437/AnsiballZ_setup.py 11000 1726867138.89305: Sending initial data 11000 1726867138.89314: Sent initial data (153 bytes) 11000 1726867138.90491: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867138.90505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867138.90579: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867138.92143: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867138.92196: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11000 1726867138.92300: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpjcv461j2 /root/.ansible/tmp/ansible-tmp-1726867138.4165814-11025-92159888715437/AnsiballZ_setup.py <<< 11000 1726867138.92316: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867138.4165814-11025-92159888715437/AnsiballZ_setup.py" <<< 11000 1726867138.92449: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpjcv461j2" to remote "/root/.ansible/tmp/ansible-tmp-1726867138.4165814-11025-92159888715437/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867138.4165814-11025-92159888715437/AnsiballZ_setup.py" <<< 11000 1726867138.95514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867138.95580: stderr chunk (state=3): >>><<< 11000 1726867138.95589: stdout chunk (state=3): >>><<< 11000 1726867138.95613: done transferring module to remote 11000 1726867138.95632: _low_level_execute_command(): starting 11000 1726867138.95858: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867138.4165814-11025-92159888715437/ /root/.ansible/tmp/ansible-tmp-1726867138.4165814-11025-92159888715437/AnsiballZ_setup.py && sleep 0' 11000 1726867138.97043: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867138.97695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 11000 1726867138.97764: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867138.99607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867138.99611: stdout chunk (state=3): >>><<< 11000 1726867138.99613: stderr chunk (state=3): >>><<< 11000 1726867138.99634: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867138.99644: _low_level_execute_command(): starting 11000 1726867138.99653: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867138.4165814-11025-92159888715437/AnsiballZ_setup.py && sleep 0' 11000 1726867139.00934: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867139.01186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867139.01305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867139.01404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867139.03523: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 11000 1726867139.03547: stdout chunk (state=3): >>>import _imp # builtin <<< 11000 1726867139.03582: stdout chunk (state=3): >>>import '_thread' # <<< 11000 1726867139.03700: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 11000 
1726867139.03727: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 11000 1726867139.03745: stdout chunk (state=3): >>>import 'time' # <<< 11000 1726867139.03758: stdout chunk (state=3): >>> import 'zipimport' # # installed zipimport hook <<< 11000 1726867139.03816: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867139.03828: stdout chunk (state=3): >>>import '_codecs' # <<< 11000 1726867139.03852: stdout chunk (state=3): >>>import 'codecs' # <<< 11000 1726867139.03888: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11000 1726867139.04023: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff52184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff51e7b30> <<< 11000 1726867139.04152: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff521aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 11000 1726867139.04158: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 11000 1726867139.04163: stdout chunk (state=3): >>>import '_collections_abc' # <<< 11000 1726867139.04174: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 11000 1726867139.04209: stdout chunk (state=3): >>>import 'os' # <<< 11000 1726867139.04220: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 11000 1726867139.04282: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 11000 1726867139.04357: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 11000 1726867139.04408: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff502d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867139.04714: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff502dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 11000 1726867139.04843: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11000 1726867139.04849: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 11000 1726867139.04955: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 11000 1726867139.04958: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11000 1726867139.04961: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11000 1726867139.04963: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 11000 1726867139.05071: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff506be00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 11000 1726867139.05157: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff506bec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11000 1726867139.05267: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 11000 1726867139.05369: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50a37d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50a3e60> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff5083ad0> import '_functools' # <<< 11000 1726867139.05398: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50811f0> <<< 11000 1726867139.05409: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff5068fb0> <<< 11000 1726867139.05442: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 11000 1726867139.05454: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 11000 1726867139.05517: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from 
'/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 11000 1726867139.05911: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50c3770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50c2390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff5082090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50c0bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50f8800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff5068230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff50f8cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50f8b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff50f8ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff5066d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50f9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50f9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 11000 1726867139.06088: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50fa480> import 'importlib.util' # import 'runpy' # <<< 11000 1726867139.06093: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11000 1726867139.06096: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' 
<<< 11000 1726867139.06098: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 11000 1726867139.06100: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff51106b0> import 'errno' # <<< 11000 1726867139.06103: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff5111d90> <<< 11000 1726867139.06109: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 11000 1726867139.06115: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11000 1726867139.06163: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff5112c30> <<< 11000 1726867139.06188: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867139.06220: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff5113290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff5112180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 11000 1726867139.06234: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11000 1726867139.06320: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff5113d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff5113440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50fa4e0> <<< 11000 1726867139.06369: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 11000 1726867139.06383: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11000 1726867139.06399: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 11000 1726867139.06409: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11000 1726867139.06453: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4e63bc0> <<< 11000 1726867139.06558: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11000 1726867139.06561: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4e8c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4e8c410> <<< 11000 1726867139.06563: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867139.06566: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4e8c6e0> <<< 11000 1726867139.06579: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 11000 1726867139.06589: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11000 1726867139.06629: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867139.06758: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4e8d010> <<< 11000 1726867139.06869: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4e8da00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4e8c8c0> <<< 11000 1726867139.06903: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4e61d60> <<< 11000 1726867139.06918: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11000 1726867139.06937: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 11000 1726867139.06953: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 11000 1726867139.06996: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4e8ee10> <<< 11000 1726867139.07010: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4e8db50> <<< 11000 1726867139.07087: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50fabd0> <<< 
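(The "# ... matches ...", "# code object from ..." and "import '<name>' # <loader>" entries in the chunks above look like CPython's verbose-import trace emitted while the module payload bootstraps on the managed node. As a rough, standalone sketch that is not part of this run, the same style of trace can be reproduced locally by starting Python with -v and reading its stderr:)

```python
# Illustrative only: reproduce the CPython verbose-import trace style seen in
# the log chunks above by running a throwaway interpreter with -v.
import subprocess
import sys

proc = subprocess.run(
    [sys.executable, "-v", "-c", "import base64, json"],
    capture_output=True,
    text=True,
)

# The -v trace goes to stderr; keep only lines shaped like the entries above,
# e.g. "import 'json' # <...Loader object at 0x...>" or "# code object from ...".
for line in proc.stderr.splitlines():
    if line.startswith(("import ", "# ")):
        print(line)
```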
11000 1726867139.07109: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11000 1726867139.07129: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 11000 1726867139.07158: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 11000 1726867139.07189: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4ebb140> <<< 11000 1726867139.07245: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 11000 1726867139.07260: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867139.07266: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 11000 1726867139.07292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 11000 1726867139.07383: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4edb500> <<< 11000 1726867139.07388: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11000 1726867139.07394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11000 1726867139.07451: stdout chunk (state=3): >>>import 'ntpath' # <<< 11000 1726867139.07466: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 11000 1726867139.07488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4f3c260> <<< 11000 1726867139.07536: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11000 1726867139.07542: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11000 1726867139.07557: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11000 1726867139.07655: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11000 1726867139.07670: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4f3e9c0> <<< 11000 1726867139.07744: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4f3c380> <<< 11000 1726867139.07780: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4f05280> <<< 11000 1726867139.07888: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from 
'/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4d41370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4eda300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4e8fd40> <<< 11000 1726867139.08020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11000 1726867139.08025: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f4ff4eda420> <<< 11000 1726867139.08490: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_ljxydjs3/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 11000 1726867139.08618: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4da6ff0> import '_typing' # <<< 11000 1726867139.09000: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4d85ee0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4d85040> <<< 11000 1726867139.09004: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.09006: stdout chunk (state=3): >>>import 'ansible' # <<< 11000 1726867139.09008: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.09010: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11000 1726867139.09012: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 11000 1726867139.09014: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.10536: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.12547: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4da4ec0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4dda990> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4dda720> <<< 11000 1726867139.12554: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4dda030> <<< 11000 1726867139.12558: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 11000 1726867139.12982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 11000 1726867139.12986: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4dda480> <<< 11000 1726867139.12988: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4da7c80> import 'atexit' # <<< 11000 1726867139.12991: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4ddb710> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4ddb950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11000 1726867139.12993: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 11000 1726867139.12995: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4ddbe90> <<< 11000 1726867139.13000: stdout chunk (state=3): >>>import 'pwd' # <<< 11000 1726867139.13002: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11000 1726867139.13004: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11000 1726867139.13010: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff472dcd0> <<< 11000 1726867139.13012: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867139.13015: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff472f8c0> <<< 11000 1726867139.13017: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11000 1726867139.13018: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4730200> <<< 11000 1726867139.13076: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches 
/usr/lib64/python3.12/shlex.py <<< 11000 1726867139.13102: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff47313a0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 11000 1726867139.13119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 11000 1726867139.13138: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11000 1726867139.13204: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4733e30> <<< 11000 1726867139.13229: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4ebb0b0> <<< 11000 1726867139.13267: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4732120> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 11000 1726867139.13582: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 11000 1726867139.13586: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 11000 1726867139.13588: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11000 1726867139.13590: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 11000 1726867139.13592: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 11000 1726867139.13594: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff473be00> import '_tokenize' # <<< 11000 1726867139.13622: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff473a900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff473a660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11000 1726867139.13675: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff473aba0> <<< 11000 1726867139.13705: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4732600> <<< 11000 1726867139.13747: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff477fef0> <<< 11000 1726867139.13808: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff47801d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 11000 1726867139.13834: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 11000 1726867139.14008: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4781c40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4781a00> <<< 11000 1726867139.14012: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11000 1726867139.14014: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4784200> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4782300> <<< 11000 1726867139.14016: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11000 1726867139.14049: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867139.14153: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 11000 1726867139.14176: stdout chunk (state=3): >>>import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff47879e0> <<< 11000 1726867139.14245: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff47843b0> <<< 11000 1726867139.14520: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff47887a0> # extension module 'systemd._reader' loaded 
from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4788860> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4788b00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff47802f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867139.14538: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4614200> <<< 11000 1726867139.14856: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4615520> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff478a990> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff478bd40> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff478a600> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 11000 1726867139.15001: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.15117: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11000 1726867139.15139: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 11000 1726867139.15170: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 11000 1726867139.15246: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.15371: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.15557: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.16489: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.17373: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 
'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 11000 1726867139.17417: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867139.17470: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4619730> <<< 11000 1726867139.17597: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff461a480> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff478b4d0> <<< 11000 1726867139.17625: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 11000 1726867139.17708: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.17711: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.17725: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 11000 1726867139.17824: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.18004: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff461abd0> <<< 11000 1726867139.18067: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.18459: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.18896: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.19044: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.19073: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available <<< 11000 1726867139.19111: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 11000 1726867139.19131: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.19192: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.19310: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 11000 1726867139.19324: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.19351: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.19405: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 11000 1726867139.19740: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.19849: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11000 1726867139.19949: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # <<< 11000 1726867139.19988: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff461b770> <<< 11000 1726867139.20006: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.20418: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11000 1726867139.20459: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11000 1726867139.20507: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867139.20615: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4626060> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff46233b0> <<< 11000 1726867139.20654: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 11000 1726867139.20723: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.20783: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.20811: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.20868: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867139.21100: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff470ea20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff47fe6f0> <<< 11000 1726867139.21196: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4626150> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4618050> # destroy ansible.module_utils.distro import 
'ansible.module_utils.distro' # # zipimport: zlib available <<< 11000 1726867139.21223: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.21310: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # <<< 11000 1726867139.21351: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 11000 1726867139.21415: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.21604: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11000 1726867139.21631: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.21655: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 11000 1726867139.21671: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.21749: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.21804: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.21829: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.22057: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 11000 1726867139.22062: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.22218: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11000 1726867139.22256: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.22313: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867139.22336: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 11000 1726867139.22362: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 11000 1726867139.22605: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 11000 1726867139.22609: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff46ba1e0> <<< 11000 1726867139.22642: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff428ffb0> # extension module '_pickle' loaded from 
'/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff42943e0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff46a6b10> <<< 11000 1726867139.22655: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff46bad50> <<< 11000 1726867139.22737: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff46b88c0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff46b8a70> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 11000 1726867139.22800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 11000 1726867139.22837: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 11000 1726867139.22890: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4297140> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff42969f0> <<< 11000 1726867139.22946: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4296bd0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4295e20> <<< 11000 1726867139.23072: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 11000 1726867139.23075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff42972f0> <<< 11000 1726867139.23090: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 11000 1726867139.23288: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f4ff42fddc0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4297da0> <<< 11000 1726867139.23295: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff46b9280> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 11000 1726867139.23297: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 11000 1726867139.23550: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 11000 1726867139.23556: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available <<< 11000 1726867139.23575: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 11000 1726867139.23617: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.23693: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.23726: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 11000 1726867139.23741: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.24183: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 11000 1726867139.24187: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 11000 1726867139.24545: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.24905: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 11000 1726867139.24923: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.24979: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.25017: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.25057: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.25103: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 11000 1726867139.25174: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.25216: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 11000 1726867139.25326: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available <<< 11000 1726867139.25353: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 11000 1726867139.25379: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.25432: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.25435: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 11000 1726867139.25514: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.25593: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 11000 1726867139.25751: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff42ff590> <<< 11000 1726867139.25754: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 11000 1726867139.25760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 11000 1726867139.25791: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff42fe960> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 11000 1726867139.25868: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.25920: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 11000 1726867139.25971: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.26019: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.26112: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 11000 1726867139.26192: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11000 1726867139.26262: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 11000 1726867139.26280: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.26310: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.26356: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 11000 1726867139.26406: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 11000 1726867139.26468: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867139.26529: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4332030> <<< 11000 1726867139.26701: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff43203e0> <<< 11000 1726867139.26853: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 11000 1726867139.26856: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 11000 1726867139.26970: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.26998: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.27110: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.27300: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 11000 1726867139.27327: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.27347: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 11000 1726867139.27358: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11000 1726867139.27537: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.27541: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff40b1c40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4322750> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 11000 1726867139.27582: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.27624: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 11000 1726867139.27776: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.27991: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 11000 1726867139.28027: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.28213: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # <<< 11000 1726867139.28226: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 11000 1726867139.28300: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.28475: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.28480: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.28594: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 11000 1726867139.28774: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.28919: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11000 1726867139.29502: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.29924: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 11000 1726867139.29937: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.30199: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 11000 1726867139.30227: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.30490: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 11000 1726867139.30642: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 11000 1726867139.30657: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.30664: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.30672: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network' # <<< 11000 1726867139.30683: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.30728: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.30766: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 11000 1726867139.30773: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.31119: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11000 1726867139.31166: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.31473: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 11000 1726867139.31479: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 11000 1726867139.31511: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 11000 1726867139.31517: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.31603: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.31657: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 11000 1726867139.31665: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.31689: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.31716: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 11000 1726867139.31722: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.32097: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 11000 1726867139.32282: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.32533: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available <<< 11000 1726867139.32554: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.32606: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 11000 1726867139.32612: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.32650: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.32685: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 11000 1726867139.32694: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.32892: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 11000 1726867139.32924: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.33002: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 11000 1726867139.33036: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.33040: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.33042: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 11000 1726867139.33044: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11000 1726867139.33127: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.33140: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 11000 1726867139.33154: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.33160: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.33190: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.33298: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11000 1726867139.33411: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.33764: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 11000 1726867139.33771: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.33935: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 11000 1726867139.33941: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.33986: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.34034: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 11000 1726867139.34040: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.34123: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.34139: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 11000 1726867139.34146: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.34395: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.34402: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available <<< 11000 1726867139.34545: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 11000 1726867139.34548: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 11000 1726867139.34580: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867139.34775: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 11000 1726867139.34801: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 11000 1726867139.34810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 11000 1726867139.35098: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff40df590> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff40dc830> import 'encodings.idna' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4ff40def90> <<< 11000 1726867139.46289: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff41246e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4125790> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4173740> <<< 11000 1726867139.46309: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff41275f0> <<< 11000 1726867139.46815: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 11000 1726867139.46835: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame <<< 11000 1726867139.72418: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.3603515625, "5m": 0.23046875, "15m": 0.11083984375}, 
"ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Z<<< 11000 1726867139.72705: stdout chunk (state=3): >>>a7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "18", "second": "59", "epoch": "1726867139", "epoch_int": "1726867139", "date": "2024-09-20", "time": "17:18:59", "iso8601_micro": "2024-09-20T21:18:59.356027Z", "iso8601": "2024-09-20T21:18:59Z", "iso8601_basic": "20240920T171859356027", "iso8601_basic_short": "20240920T171859", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2963, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 568, "free": 2963}, "nocache": {"free": 3296, "used": 235}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chass<<< 11000 1726867139.72738: stdout chunk (state=3): >>>is_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_uuid": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 384, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261793705984, "block_size": 4096, "block_total": 65519099, "block_available": 63914479, "block_used": 1604620, "inode_total": 131070960, "inode_available": 131029071, "inode_used": 41889, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:feff:fed3:7d4f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": 
"on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": 
"off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.57"], "ansible_all_ipv6_addresses": ["fe80::8ff:feff:fed3:7d4f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.57", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:feff:fed3:7d4f"]}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11000 1726867139.73529: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type<<< 11000 1726867139.73628: stdout chunk (state=3): >>> # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs<<< 11000 1726867139.73633: stdout chunk (state=3): >>> # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat<<< 11000 1726867139.73636: stdout chunk (state=3): >>> # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] 
removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types<<< 11000 1726867139.73664: stdout chunk (state=3): >>> # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum <<< 11000 1726867139.73693: stdout chunk (state=3): >>># cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect<<< 11000 1726867139.73749: stdout chunk (state=3): >>> # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile<<< 11000 1726867139.73768: stdout chunk (state=3): >>> # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing<<< 11000 1726867139.73841: stdout chunk (state=3): >>> # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json<<< 11000 1726867139.73845: stdout chunk (state=3): >>> # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing 
token # destroy token # cleanup[2] removing _tokenize<<< 11000 1726867139.73897: stdout chunk (state=3): >>> # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string<<< 11000 1726867139.73903: stdout chunk (state=3): >>> # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters<<< 11000 1726867139.74036: stdout chunk (state=3): >>> # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace<<< 11000 1726867139.74050: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection<<< 11000 1726867139.74158: stdout chunk (state=3): >>> # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor <<< 11000 1726867139.74200: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing 
ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy 
ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 11000 1726867139.74705: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11000 1726867139.74762: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 11000 1726867139.74808: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii <<< 11000 1726867139.74827: stdout chunk (state=3): >>># destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress<<< 11000 1726867139.74872: stdout chunk (state=3): >>> # destroy ntpath <<< 11000 1726867139.74907: stdout chunk (state=3): >>># destroy importlib <<< 11000 1726867139.74920: stdout chunk (state=3): >>># destroy zipimport<<< 11000 1726867139.74982: stdout chunk (state=3): >>> # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder<<< 11000 1726867139.74987: stdout chunk (state=3): >>> # 
destroy json.encoder # destroy json.scanner<<< 11000 1726867139.74992: stdout chunk (state=3): >>> # destroy _json<<< 11000 1726867139.75029: stdout chunk (state=3): >>> # destroy grp # destroy encodings # destroy _locale<<< 11000 1726867139.75033: stdout chunk (state=3): >>> # destroy locale # destroy select # destroy _signal<<< 11000 1726867139.75054: stdout chunk (state=3): >>> # destroy _posixsubprocess <<< 11000 1726867139.75110: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selinux<<< 11000 1726867139.75141: stdout chunk (state=3): >>> # destroy shutil <<< 11000 1726867139.75164: stdout chunk (state=3): >>># destroy distro<<< 11000 1726867139.75167: stdout chunk (state=3): >>> <<< 11000 1726867139.75258: stdout chunk (state=3): >>># destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors <<< 11000 1726867139.75265: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector<<< 11000 1726867139.75267: stdout chunk (state=3): >>> <<< 11000 1726867139.75295: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal<<< 11000 1726867139.75308: stdout chunk (state=3): >>> # destroy pickle<<< 11000 1726867139.75338: stdout chunk (state=3): >>> # destroy _compat_pickle # destroy _pickle<<< 11000 1726867139.75373: stdout chunk (state=3): >>> <<< 11000 1726867139.75391: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors<<< 11000 1726867139.75423: stdout chunk (state=3): >>> # destroy shlex<<< 11000 1726867139.75436: stdout chunk (state=3): >>> # destroy fcntl # destroy datetime<<< 11000 1726867139.75484: stdout chunk (state=3): >>> # destroy subprocess<<< 11000 1726867139.75507: stdout chunk (state=3): >>> # destroy base64<<< 11000 1726867139.75518: stdout chunk (state=3): >>> # destroy _ssl <<< 11000 1726867139.75539: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux<<< 11000 1726867139.75575: stdout chunk (state=3): >>> # destroy getpass # destroy pwd # destroy termios<<< 11000 1726867139.75620: stdout chunk (state=3): >>> # destroy json<<< 11000 1726867139.75627: stdout chunk (state=3): >>> # destroy socket # destroy struct<<< 11000 1726867139.75663: stdout chunk (state=3): >>> <<< 11000 1726867139.75666: stdout chunk (state=3): >>># destroy glob # destroy fnmatch<<< 11000 1726867139.75712: stdout chunk (state=3): >>> # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile<<< 11000 1726867139.75715: stdout chunk (state=3): >>> # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array<<< 11000 1726867139.75821: stdout chunk (state=3): >>> # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna <<< 11000 1726867139.75824: stdout chunk (state=3): >>># destroy stringprep # cleanup[3] wiping configparser <<< 11000 1726867139.75851: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves 
# destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128<<< 11000 1726867139.75893: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache<<< 11000 1726867139.75926: stdout chunk (state=3): >>> # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit<<< 11000 1726867139.75973: stdout chunk (state=3): >>> # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref<<< 11000 1726867139.75979: stdout chunk (state=3): >>> # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math<<< 11000 1726867139.76001: stdout chunk (state=3): >>> # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external<<< 11000 1726867139.76061: stdout chunk (state=3): >>> # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg<<< 11000 1726867139.76064: stdout chunk (state=3): >>> # cleanup[3] wiping re._parser # cleanup[3] wiping _sre<<< 11000 1726867139.76080: stdout chunk (state=3): >>> # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools<<< 11000 1726867139.76101: stdout chunk (state=3): >>> # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig<<< 11000 1726867139.76147: stdout chunk (state=3): >>> # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath<<< 11000 1726867139.76179: stdout chunk (state=3): >>> # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases<<< 11000 1726867139.76188: stdout chunk (state=3): >>> # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix<<< 11000 1726867139.76211: stdout chunk (state=3): >>> # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys<<< 11000 1726867139.76221: stdout chunk (state=3): >>> # cleanup[3] wiping builtins # destroy selinux._selinux<<< 11000 1726867139.76257: stdout chunk (state=3): >>> # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11000 1726867139.76488: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket<<< 11000 1726867139.76519: stdout chunk (state=3): >>> # destroy _collections <<< 11000 1726867139.76564: stdout chunk (state=3): >>># destroy platform # destroy _uuid<<< 11000 1726867139.76587: stdout chunk (state=3): >>> # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize<<< 11000 1726867139.76619: stdout chunk (state=3): >>> # 
destroy ansible.module_utils.six.moves.urllib<<< 11000 1726867139.76630: stdout chunk (state=3): >>> <<< 11000 1726867139.76678: stdout chunk (state=3): >>># destroy copyreg<<< 11000 1726867139.76742: stdout chunk (state=3): >>> # destroy contextlib # destroy _typing <<< 11000 1726867139.76765: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse<<< 11000 1726867139.76862: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves<<< 11000 1726867139.76898: stdout chunk (state=3): >>> # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path<<< 11000 1726867139.77122: stdout chunk (state=3): >>> # clear sys.modules<<< 11000 1726867139.77126: stdout chunk (state=3): >>> # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 11000 1726867139.77159: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 11000 1726867139.77225: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib<<< 11000 1726867139.77256: stdout chunk (state=3): >>> # destroy _operator<<< 11000 1726867139.77291: stdout chunk (state=3): >>> # destroy _sre # destroy _string # destroy re # destroy itertools<<< 11000 1726867139.77319: stdout chunk (state=3): >>> # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread<<< 11000 1726867139.77358: stdout chunk (state=3): >>> # clear sys.audit hooks<<< 11000 1726867139.77474: stdout chunk (state=3): >>> <<< 11000 1726867139.77850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867139.77861: stderr chunk (state=3): >>>Shared connection to 10.31.12.57 closed. 
<<< 11000 1726867139.78083: stderr chunk (state=3): >>><<< 11000 1726867139.78088: stdout chunk (state=3): >>><<< 11000 1726867139.78367: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff52184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff51e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff521aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff502d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff502dfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff506be00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff506bec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50a37d0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50a3e60> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff5083ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50811f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff5068fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50c3770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50c2390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff5082090> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50c0bc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50f8800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff5068230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff50f8cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50f8b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff50f8ef0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff5066d50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50f9580> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50f9250> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50fa480> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff51106b0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff5111d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f4ff5112c30> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff5113290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff5112180> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff5113d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff5113440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50fa4e0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4e63bc0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4e8c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4e8c410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4e8c6e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4e8d010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4e8da00> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4e8c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4e61d60> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4e8ee10> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4e8db50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff50fabd0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4ebb140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4edb500> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4f3c260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4f3e9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4f3c380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4f05280> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4d41370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4eda300> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4e8fd40> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7f4ff4eda420> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_ljxydjs3/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4da6ff0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4d85ee0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4d85040> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4da4ec0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4dda990> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4dda720> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4dda030> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4dda480> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4da7c80> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4ddb710> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 
'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4ddb950> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4ddbe90> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff472dcd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff472f8c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4730200> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff47313a0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4733e30> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4ebb0b0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4732120> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff473be00> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff473a900> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f4ff473a660> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff473aba0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4732600> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff477fef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff47801d0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4781c40> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4781a00> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4784200> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4782300> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff47879e0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff47843b0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff47887a0> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4788860> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4788b00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff47802f0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4614200> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4615520> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff478a990> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff478bd40> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff478a600> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4619730> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff461a480> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff478b4d0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff461abd0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff461b770> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4626060> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff46233b0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff470ea20> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff47fe6f0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4626150> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4618050> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff46ba1e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff428ffb0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff42943e0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff46a6b10> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff46bad50> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff46b88c0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff46b8a70> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4297140> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff42969f0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4296bd0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4295e20> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff42972f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff42fddc0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4297da0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff46b9280> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff42ff590> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff42fe960> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff4332030> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff43203e0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff40b1c40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4322750> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4ff40df590> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff40dc830> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff40def90> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff41246e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4125790> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff4173740> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4ff41275f0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.3603515625, "5m": 0.23046875, "15m": 0.11083984375}, "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Za7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "18", "second": "59", "epoch": "1726867139", "epoch_int": "1726867139", "date": "2024-09-20", "time": "17:18:59", "iso8601_micro": "2024-09-20T21:18:59.356027Z", "iso8601": 
"2024-09-20T21:18:59Z", "iso8601_basic": "20240920T171859356027", "iso8601_basic_short": "20240920T171859", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2963, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 568, "free": 2963}, "nocache": {"free": 3296, "used": 235}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_uuid": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": 
{"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 384, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261793705984, "block_size": 4096, "block_total": 65519099, "block_available": 63914479, "block_used": 1604620, "inode_total": 131070960, "inode_available": 131029071, "inode_used": 41889, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_lsb": {}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:feff:fed3:7d4f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", 
"rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, 
"ansible_all_ipv4_addresses": ["10.31.12.57"], "ansible_all_ipv6_addresses": ["fe80::8ff:feff:fed3:7d4f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.57", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:feff:fed3:7d4f"]}, "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing 
random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] 
removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # 
cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # 
destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] 
removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping 
_functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. [WARNING]: Module invocation had junk after the JSON data: (the same interpreter shutdown output already captured above, from "# clear sys.path_importer_cache" through "# clear sys.audit hooks") [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
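The interpreter-discovery warning above goes away once the interpreter is pinned instead of discovered. A minimal inventory sketch, not part of this run's actual inventory file, reusing the host alias and address seen in this log and the path reported by the warning:

all:
  hosts:
    managed_node1:
      ansible_host: 10.31.12.57
      # Pinning the interpreter disables discovery and the associated warning.
      ansible_python_interpreter: /usr/bin/python3.12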
11000 1726867139.82838: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867138.4165814-11025-92159888715437/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867139.82853: _low_level_execute_command(): starting 11000 1726867139.82856: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867138.4165814-11025-92159888715437/ > /dev/null 2>&1 && sleep 0' 11000 1726867139.83725: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867139.84006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867139.84075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867139.84083: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11000 1726867139.86107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867139.86233: stderr chunk (state=3): >>><<< 11000 1726867139.86237: stdout chunk (state=3): >>><<< 11000 1726867139.86239: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11000 1726867139.86241: handler run complete 11000 1726867139.86562: variable 'ansible_facts' from source: unknown 11000 1726867139.86697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867139.87366: variable 'ansible_facts' from source: unknown 11000 1726867139.87553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867139.87875: attempt loop complete, returning result 11000 1726867139.87889: _execute() done 11000 1726867139.87896: dumping result to json 11000 1726867139.87982: done dumping result, returning 11000 1726867139.88025: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affcac9-a3a5-c734-026a-0000000000cd] 11000 1726867139.88064: sending task result for task 0affcac9-a3a5-c734-026a-0000000000cd 11000 1726867139.89372: done sending task result for task 0affcac9-a3a5-c734-026a-0000000000cd 11000 1726867139.89375: WORKER PROCESS EXITING ok: [managed_node1] 11000 1726867139.89669: no more pending results, returning what we have 11000 1726867139.89671: results queue empty 11000 1726867139.89672: checking for any_errors_fatal 11000 1726867139.89673: done checking for any_errors_fatal 11000 1726867139.89674: checking for max_fail_percentage 11000 1726867139.89675: done checking for max_fail_percentage 11000 1726867139.89676: checking to see if all hosts have failed and the running result is not ok 11000 1726867139.89676: done checking to see if all hosts have failed 11000 1726867139.89814: getting the remaining hosts for this loop 11000 1726867139.89816: done getting the remaining hosts for this loop 11000 1726867139.89820: getting the next task for host managed_node1 11000 1726867139.89825: done getting next task for host managed_node1 11000 1726867139.89827: ^ task is: TASK: meta (flush_handlers) 11000 1726867139.89829: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867139.89833: getting variables 11000 1726867139.89834: in VariableManager get_vars() 11000 1726867139.89858: Calling all_inventory to load vars for managed_node1 11000 1726867139.89861: Calling groups_inventory to load vars for managed_node1 11000 1726867139.89864: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867139.89874: Calling all_plugins_play to load vars for managed_node1 11000 1726867139.89879: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867139.89883: Calling groups_plugins_play to load vars for managed_node1 11000 1726867139.90219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867139.90652: done with get_vars() 11000 1726867139.90783: done getting variables 11000 1726867139.90855: in VariableManager get_vars() 11000 1726867139.90865: Calling all_inventory to load vars for managed_node1 11000 1726867139.90867: Calling groups_inventory to load vars for managed_node1 11000 1726867139.90870: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867139.90875: Calling all_plugins_play to load vars for managed_node1 11000 1726867139.90981: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867139.90995: Calling groups_plugins_play to load vars for managed_node1 11000 1726867139.91250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867139.91769: done with get_vars() 11000 1726867139.91794: done queuing things up, now waiting for results queue to drain 11000 1726867139.91797: results queue empty 11000 1726867139.91797: checking for any_errors_fatal 11000 1726867139.91800: done checking for any_errors_fatal 11000 1726867139.91801: checking for max_fail_percentage 11000 1726867139.91802: done checking for max_fail_percentage 11000 1726867139.91803: checking to see if all hosts have failed and the running result is not ok 11000 1726867139.91804: done checking to see if all hosts have failed 11000 1726867139.91809: getting the remaining hosts for this loop 11000 1726867139.91810: done getting the remaining hosts for this loop 11000 1726867139.91814: getting the next task for host managed_node1 11000 1726867139.91818: done getting next task for host managed_node1 11000 1726867139.91821: ^ task is: TASK: Include the task 'el_repo_setup.yml' 11000 1726867139.91822: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867139.91824: getting variables 11000 1726867139.91825: in VariableManager get_vars() 11000 1726867139.91833: Calling all_inventory to load vars for managed_node1 11000 1726867139.91835: Calling groups_inventory to load vars for managed_node1 11000 1726867139.91837: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867139.91842: Calling all_plugins_play to load vars for managed_node1 11000 1726867139.91844: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867139.91846: Calling groups_plugins_play to load vars for managed_node1 11000 1726867139.92212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867139.92624: done with get_vars() 11000 1726867139.92632: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:11 Friday 20 September 2024 17:18:59 -0400 (0:00:01.560) 0:00:01.571 ****** 11000 1726867139.92834: entering _queue_task() for managed_node1/include_tasks 11000 1726867139.92836: Creating lock for include_tasks 11000 1726867139.93699: worker is 1 (out of 1 available) 11000 1726867139.93710: exiting _queue_task() for managed_node1/include_tasks 11000 1726867139.93720: done queuing things up, now waiting for results queue to drain 11000 1726867139.93721: waiting for pending results... 11000 1726867139.94246: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 11000 1726867139.94252: in run() - task 0affcac9-a3a5-c734-026a-000000000006 11000 1726867139.94288: variable 'ansible_search_path' from source: unknown 11000 1726867139.94453: calling self._execute() 11000 1726867139.94526: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867139.94539: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867139.94573: variable 'omit' from source: magic vars 11000 1726867139.94996: _execute() done 11000 1726867139.95000: dumping result to json 11000 1726867139.95002: done dumping result, returning 11000 1726867139.95004: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [0affcac9-a3a5-c734-026a-000000000006] 11000 1726867139.95006: sending task result for task 0affcac9-a3a5-c734-026a-000000000006 11000 1726867139.95072: done sending task result for task 0affcac9-a3a5-c734-026a-000000000006 11000 1726867139.95075: WORKER PROCESS EXITING 11000 1726867139.95137: no more pending results, returning what we have 11000 1726867139.95142: in VariableManager get_vars() 11000 1726867139.95175: Calling all_inventory to load vars for managed_node1 11000 1726867139.95180: Calling groups_inventory to load vars for managed_node1 11000 1726867139.95184: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867139.95207: Calling all_plugins_play to load vars for managed_node1 11000 1726867139.95210: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867139.95213: Calling groups_plugins_play to load vars for managed_node1 11000 1726867139.95827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867139.96110: done with get_vars() 11000 1726867139.96118: variable 'ansible_search_path' from source: unknown 11000 1726867139.96248: we have included files to process 11000 1726867139.96250: 
generating all_blocks data 11000 1726867139.96251: done generating all_blocks data 11000 1726867139.96252: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11000 1726867139.96253: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11000 1726867139.96257: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11000 1726867139.97759: in VariableManager get_vars() 11000 1726867139.97776: done with get_vars() 11000 1726867139.97911: done processing included file 11000 1726867139.97914: iterating over new_blocks loaded from include file 11000 1726867139.97916: in VariableManager get_vars() 11000 1726867139.97927: done with get_vars() 11000 1726867139.97929: filtering new block on tags 11000 1726867139.97944: done filtering new block on tags 11000 1726867139.97947: in VariableManager get_vars() 11000 1726867139.97957: done with get_vars() 11000 1726867139.97959: filtering new block on tags 11000 1726867139.97975: done filtering new block on tags 11000 1726867139.98009: in VariableManager get_vars() 11000 1726867139.98025: done with get_vars() 11000 1726867139.98026: filtering new block on tags 11000 1726867139.98040: done filtering new block on tags 11000 1726867139.98042: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 11000 1726867139.98048: extending task lists for all hosts with included blocks 11000 1726867139.98099: done extending task lists 11000 1726867139.98100: done processing included files 11000 1726867139.98101: results queue empty 11000 1726867139.98102: checking for any_errors_fatal 11000 1726867139.98103: done checking for any_errors_fatal 11000 1726867139.98104: checking for max_fail_percentage 11000 1726867139.98105: done checking for max_fail_percentage 11000 1726867139.98105: checking to see if all hosts have failed and the running result is not ok 11000 1726867139.98106: done checking to see if all hosts have failed 11000 1726867139.98107: getting the remaining hosts for this loop 11000 1726867139.98108: done getting the remaining hosts for this loop 11000 1726867139.98110: getting the next task for host managed_node1 11000 1726867139.98232: done getting next task for host managed_node1 11000 1726867139.98235: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 11000 1726867139.98237: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
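Each block loaded from the included file is "filtered on tags" before being appended to the host's task list. A conceptual illustration of that filtering step, assuming a plain list-of-dicts task representation rather than ansible-core's Block/Task objects:

```python
# Keep tasks whose tags survive --tags/--skip-tags; tasks tagged "always" run
# unless explicitly skipped. Conceptual only, not ansible-core's API.
def filter_on_tags(tasks, only_tags=None, skip_tags=None):
    only_tags, skip_tags = set(only_tags or []), set(skip_tags or [])
    kept = []
    for task in tasks:
        tags = set(task.get("tags", []))
        if skip_tags & tags:
            continue
        if only_tags and not (only_tags & tags or "always" in tags):
            continue
        kept.append(task)
    return kept

tasks = [{"name": "Gather the minimum subset of ansible_facts", "tags": []}]
print(filter_on_tags(tasks))   # no --tags were given, so everything is kept
```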
False 11000 1726867139.98240: getting variables 11000 1726867139.98241: in VariableManager get_vars() 11000 1726867139.98250: Calling all_inventory to load vars for managed_node1 11000 1726867139.98252: Calling groups_inventory to load vars for managed_node1 11000 1726867139.98255: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867139.98260: Calling all_plugins_play to load vars for managed_node1 11000 1726867139.98262: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867139.98265: Calling groups_plugins_play to load vars for managed_node1 11000 1726867139.98607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867139.99026: done with get_vars() 11000 1726867139.99035: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Friday 20 September 2024 17:18:59 -0400 (0:00:00.064) 0:00:01.635 ****** 11000 1726867139.99233: entering _queue_task() for managed_node1/setup 11000 1726867140.00112: worker is 1 (out of 1 available) 11000 1726867140.00123: exiting _queue_task() for managed_node1/setup 11000 1726867140.00134: done queuing things up, now waiting for results queue to drain 11000 1726867140.00135: waiting for pending results... 11000 1726867140.00636: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 11000 1726867140.00641: in run() - task 0affcac9-a3a5-c734-026a-0000000000de 11000 1726867140.00644: variable 'ansible_search_path' from source: unknown 11000 1726867140.00843: variable 'ansible_search_path' from source: unknown 11000 1726867140.00846: calling self._execute() 11000 1726867140.00983: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867140.00989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867140.00992: variable 'omit' from source: magic vars 11000 1726867140.02099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867140.06728: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867140.06922: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867140.07015: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867140.07128: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867140.07218: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867140.07483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867140.07502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867140.07542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
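The "Loading FilterModule ... from ..." lines show Jinja2 filter plugins being imported from explicit file paths and then served from a cache. A rough sketch of loading a module from a path with importlib and memoising it; this mirrors the idea behind "(found_in_cache=True)" but is not ansible-core's plugin loader, and executing the real core.py requires ansible-core to be installed at that path:

```python
import importlib.util

_cache = {}

def load_plugin(name, path):
    """Load a plugin module from an explicit file path, caching by name."""
    if name in _cache:                       # roughly "(found_in_cache=True)"
        return _cache[name]
    spec = importlib.util.spec_from_file_location(name, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    _cache[name] = module
    return module

core = load_plugin(
    "core_filters",
    "/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py")
print(core.FilterModule().filters().keys())  # filters that plugin provides
```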
(found_in_cache=True, class_only=False) 11000 1726867140.07732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867140.07735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867140.08098: variable 'ansible_facts' from source: unknown 11000 1726867140.08295: variable 'network_test_required_facts' from source: task vars 11000 1726867140.08338: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 11000 1726867140.08399: variable 'omit' from source: magic vars 11000 1726867140.08442: variable 'omit' from source: magic vars 11000 1726867140.08533: variable 'omit' from source: magic vars 11000 1726867140.08715: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867140.08718: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867140.08720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867140.08722: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867140.08783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867140.09041: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867140.09044: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867140.09047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867140.09155: Set connection var ansible_shell_type to sh 11000 1726867140.09182: Set connection var ansible_pipelining to False 11000 1726867140.09256: Set connection var ansible_shell_executable to /bin/sh 11000 1726867140.09260: Set connection var ansible_connection to ssh 11000 1726867140.09263: Set connection var ansible_timeout to 10 11000 1726867140.09266: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867140.09269: variable 'ansible_shell_executable' from source: unknown 11000 1726867140.09270: variable 'ansible_connection' from source: unknown 11000 1726867140.09272: variable 'ansible_module_compression' from source: unknown 11000 1726867140.09274: variable 'ansible_shell_type' from source: unknown 11000 1726867140.09276: variable 'ansible_shell_executable' from source: unknown 11000 1726867140.09280: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867140.09282: variable 'ansible_pipelining' from source: unknown 11000 1726867140.09284: variable 'ansible_timeout' from source: unknown 11000 1726867140.09514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867140.09652: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11000 1726867140.09949: variable 'omit' from source: magic vars 11000 1726867140.09952: starting attempt loop 11000 
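The conditional evaluated above only lets the setup call run when the facts already gathered do not cover everything in network_test_required_facts. An equivalent check in plain Python, in spirit; the fact names below are placeholders, since the actual network_test_required_facts list is defined in the test's vars and is not visible in this part of the log:

```python
# Plain-Python equivalent (in spirit) of:
#   not ansible_facts.keys() | list | intersect(network_test_required_facts)
#       == network_test_required_facts
ansible_facts = {}                                           # nothing gathered yet
network_test_required_facts = ["distribution", "os_family"]  # placeholder values

already_present = [f for f in network_test_required_facts if f in ansible_facts]
run_setup = not (already_present == network_test_required_facts)
print(run_setup)   # True -> gather the minimum subset of facts
```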
1726867140.09954: running the handler 11000 1726867140.09957: _low_level_execute_command(): starting 11000 1726867140.09959: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867140.11391: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867140.11572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867140.11619: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867140.11728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867140.13409: stdout chunk (state=3): >>>/root <<< 11000 1726867140.13512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867140.13616: stderr chunk (state=3): >>><<< 11000 1726867140.13660: stdout chunk (state=3): >>><<< 11000 1726867140.13761: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867140.13772: _low_level_execute_command(): starting 11000 1726867140.13788: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867140.137419-11088-194151728025335 `" && echo ansible-tmp-1726867140.137419-11088-194151728025335="` echo /root/.ansible/tmp/ansible-tmp-1726867140.137419-11088-194151728025335 `" ) && sleep 0' 11000 1726867140.15150: stderr chunk (state=2): 
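_low_level_execute_command() first runs `echo ~` to discover the remote home directory, reusing the ControlMaster socket that the "auto-mux: Trying existing master" lines refer to. A stripped-down version of that call with the stock ssh client, for illustration only; the real connection plugin passes many more options (BatchMode, timeouts, user, verbosity), and the host and ControlPath here are copied from the log:

```python
import subprocess

# Stripped-down equivalent of the `echo ~` probe, reusing an existing ssh
# ControlMaster socket. Illustrative only; assumes working ssh authentication.
cmd = [
    "ssh",
    "-o", "ControlMaster=auto",
    "-o", "ControlPersist=60s",
    "-o", "ControlPath=/root/.ansible/cp/ac0999e354",
    "10.31.12.57",
    "/bin/sh -c 'echo ~ && sleep 0'",
]
proc = subprocess.run(cmd, capture_output=True, text=True)
print(proc.returncode, proc.stdout.strip())   # expected: 0 /root
```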
>>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867140.15270: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867140.15294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867140.15313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867140.15333: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867140.15347: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867140.15362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867140.15394: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11000 1726867140.15487: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867140.15512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867140.15607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867140.17497: stdout chunk (state=3): >>>ansible-tmp-1726867140.137419-11088-194151728025335=/root/.ansible/tmp/ansible-tmp-1726867140.137419-11088-194151728025335 <<< 11000 1726867140.17697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867140.17701: stdout chunk (state=3): >>><<< 11000 1726867140.17704: stderr chunk (state=3): >>><<< 11000 1726867140.17991: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867140.137419-11088-194151728025335=/root/.ansible/tmp/ansible-tmp-1726867140.137419-11088-194151728025335 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867140.17995: variable 'ansible_module_compression' from source: unknown 11000 1726867140.18022: ANSIBALLZ: 
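The second command creates a per-task temporary directory on the target; judging by the name in the log, it combines an epoch timestamp, a pid-like number and a random suffix. A rough, purely illustrative reconstruction of that name and of the shell command the log shows (the exact composition is ansible-core internal):

```python
import os, random, shlex, time

# Rough approximation of "ansible-tmp-1726867140.137419-11088-194151728025335";
# illustrative only, not ansible-core's exact naming code.
remote_tmp_base = "/root/.ansible/tmp"
name = "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2**48))
remote_tmp = "%s/%s" % (remote_tmp_base, name)

# Shell command reconstructed from the log (umask 77 => directories mode 0700):
cmd = "( umask 77 && mkdir -p %s && mkdir %s ) && sleep 0" % (
    shlex.quote(remote_tmp_base), shlex.quote(remote_tmp))
print(cmd)
```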
using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11000 1726867140.18090: variable 'ansible_facts' from source: unknown 11000 1726867140.18661: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867140.137419-11088-194151728025335/AnsiballZ_setup.py 11000 1726867140.19032: Sending initial data 11000 1726867140.19035: Sent initial data (153 bytes) 11000 1726867140.20202: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867140.20230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867140.20243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867140.20395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867140.20512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867140.22452: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867140.22509: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867140.22565: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpnvs5xjve /root/.ansible/tmp/ansible-tmp-1726867140.137419-11088-194151728025335/AnsiballZ_setup.py <<< 11000 1726867140.22568: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867140.137419-11088-194151728025335/AnsiballZ_setup.py" <<< 11000 1726867140.22656: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpnvs5xjve" to remote "/root/.ansible/tmp/ansible-tmp-1726867140.137419-11088-194151728025335/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867140.137419-11088-194151728025335/AnsiballZ_setup.py" <<< 11000 1726867140.25535: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867140.25649: stderr chunk (state=3): >>><<< 11000 1726867140.25673: stdout chunk (state=3): >>><<< 11000 1726867140.25750: done transferring module to remote 11000 1726867140.25754: _low_level_execute_command(): starting 11000 1726867140.25757: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867140.137419-11088-194151728025335/ /root/.ansible/tmp/ansible-tmp-1726867140.137419-11088-194151728025335/AnsiballZ_setup.py && sleep 0' 11000 1726867140.26512: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867140.26674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867140.26703: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867140.26797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867140.29309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867140.29313: stdout chunk (state=3): >>><<< 11000 1726867140.29315: stderr chunk (state=3): >>><<< 11000 1726867140.29319: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address 
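With the temporary directory in place, the built AnsiballZ_setup.py is uploaded over the same multiplexed connection using SFTP and then made executable with chmod u+x. A simplified re-creation of those two steps with the stock sftp/ssh clients; the root@ login and all paths are taken from the log, and the real plugin would pass its ControlMaster options here as well:

```python
import subprocess

host = "root@10.31.12.57"
local = "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpnvs5xjve"
remote_dir = "/root/.ansible/tmp/ansible-tmp-1726867140.137419-11088-194151728025335"
remote = remote_dir + "/AnsiballZ_setup.py"

# 1. SFTP "put" of the module ("-b -" reads batch commands from stdin)
subprocess.run(["sftp", "-b", "-", host],
               input=f"put {local} {remote}\n", text=True, check=True)

# 2. chmod u+x on the temp dir and the module, as shown in the log
subprocess.run(["ssh", host, f"chmod u+x {remote_dir} {remote} && sleep 0"],
               check=True)
```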
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867140.29322: _low_level_execute_command(): starting 11000 1726867140.29323: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867140.137419-11088-194151728025335/AnsiballZ_setup.py && sleep 0' 11000 1726867140.30729: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867140.30773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867140.30833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867140.30928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867140.33918: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 11000 1726867140.34097: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 11000 1726867140.34158: stdout chunk (state=3): >>>import 'codecs' # <<< 11000 1726867140.34180: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11000 1726867140.34345: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2ae04d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2aafb30> # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2ae2a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # <<< 11000 1726867140.34547: stdout chunk (state=3): >>>import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<< 11000 1726867140.34575: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 11000 1726867140.34627: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 11000 1726867140.34632: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 11000 1726867140.34689: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 11000 1726867140.34706: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2891130> <<< 11000 1726867140.34793: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 11000 1726867140.34816: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2891fa0> <<< 11000 1726867140.34842: stdout chunk (state=3): >>>import 'site' # <<< 11000 1726867140.34870: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
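Because the module is launched with PYTHONVERBOSE=1, everything streamed back in these stdout chunks is the interpreter's import trace (the "# ... matches ..." and "import '...' #" lines) rather than module output. A similar trace can be reproduced locally like this, purely for illustration:

```python
import os
import subprocess
import sys

# Run any small script with PYTHONVERBOSE=1 to get the same kind of import
# trace that fills the stdout chunks here. Equivalent to `python3 -v`.
env = dict(os.environ, PYTHONVERBOSE="1")
subprocess.run([sys.executable, "-c", "import json"], env=env, check=True)
```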
<<< 11000 1726867140.35501: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11000 1726867140.35522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 11000 1726867140.35549: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 11000 1726867140.35563: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867140.35591: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 11000 1726867140.35642: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11000 1726867140.35668: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11000 1726867140.35700: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 11000 1726867140.35709: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc28cfe60> <<< 11000 1726867140.35738: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 11000 1726867140.35758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 11000 1726867140.35796: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc28cff20> <<< 11000 1726867140.35816: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11000 1726867140.35942: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867140.35958: stdout chunk (state=3): >>>import 'itertools' # <<< 11000 1726867140.35997: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2907890> <<< 11000 1726867140.36028: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 11000 1726867140.36045: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 11000 1726867140.36052: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2907f20> <<< 11000 1726867140.36066: stdout chunk (state=3): >>>import '_collections' # <<< 11000 1726867140.36140: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc28e7b30> <<< 11000 1726867140.36149: stdout chunk (state=3): >>>import '_functools' # <<< 11000 1726867140.36192: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f4dc28e5250> <<< 11000 1726867140.36320: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc28cd010> <<< 11000 1726867140.36447: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 11000 1726867140.36457: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 11000 1726867140.36471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11000 1726867140.36504: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2927800> <<< 11000 1726867140.36530: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2926450> <<< 11000 1726867140.36570: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' <<< 11000 1726867140.36573: stdout chunk (state=3): >>>import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc28e6120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2924cb0> <<< 11000 1726867140.36649: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 11000 1726867140.36662: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc295c860> <<< 11000 1726867140.36667: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc28cc290> <<< 11000 1726867140.36964: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc295cd10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc295cbc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc295cfb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc28cadb0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc295d6a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc295d370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc295e5a0> import 'importlib.util' # import 'runpy' # <<< 11000 1726867140.37000: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11000 1726867140.37042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 11000 1726867140.37076: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 11000 1726867140.37098: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc29747a0> import 'errno' # <<< 11000 1726867140.37142: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc2975e80> <<< 11000 1726867140.37179: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 11000 1726867140.37201: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11000 1726867140.37242: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 11000 1726867140.37252: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2976d20> <<< 11000 1726867140.37295: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc2977320> <<< 11000 1726867140.37315: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2976270> <<< 11000 1726867140.37350: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11000 1726867140.37410: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867140.37413: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import 
'_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc2977da0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc29774d0> <<< 11000 1726867140.37475: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc295e510> <<< 11000 1726867140.37502: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 11000 1726867140.37534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11000 1726867140.37558: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 11000 1726867140.37590: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11000 1726867140.37631: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc2667bf0> <<< 11000 1726867140.37662: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11000 1726867140.37708: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc26906b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2690410> <<< 11000 1726867140.37951: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc26906e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867140.38053: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc2691010> <<< 11000 1726867140.38231: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867140.38236: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc26919d0> <<< 11000 1726867140.38266: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc26908c0> import 'random' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2665d90> <<< 11000 1726867140.38296: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11000 1726867140.38322: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 11000 1726867140.38348: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 11000 1726867140.38381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2692d20> <<< 11000 1726867140.38405: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2690e60> <<< 11000 1726867140.38435: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc295e750> <<< 11000 1726867140.38462: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11000 1726867140.38550: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867140.38573: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 11000 1726867140.38618: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 11000 1726867140.38660: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc26bf080> <<< 11000 1726867140.38733: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 11000 1726867140.38746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867140.38781: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 11000 1726867140.38803: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 11000 1726867140.38859: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc26df440> <<< 11000 1726867140.39059: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2740260> <<< 11000 1726867140.39070: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11000 1726867140.39102: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11000 1726867140.39133: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11000 1726867140.39190: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11000 1726867140.39314: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc27429c0> <<< 11000 1726867140.39431: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2740380> <<< 11000 1726867140.39490: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc270d250> <<< 11000 1726867140.39535: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2545340> <<< 11000 1726867140.39538: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc26de240> <<< 11000 1726867140.39552: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2693c50> <<< 11000 1726867140.39820: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11000 1726867140.39850: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f4dc26de840> <<< 11000 1726867140.40242: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_maw2_5wu/ansible_setup_payload.zip' <<< 11000 1726867140.40245: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.40441: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.40496: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 11000 1726867140.40499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11000 1726867140.40546: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11000 1726867140.40715: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc25aef90> import '_typing' # <<< 11000 1726867140.40897: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc258de80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc258d070> <<< 11000 1726867140.40905: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.40939: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 11000 1726867140.41255: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 11000 1726867140.43250: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.45115: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 11000 1726867140.45119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc25ace30> <<< 11000 1726867140.45148: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 11000 1726867140.45189: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 11000 1726867140.45214: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 11000 1726867140.45241: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 11000 1726867140.45254: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc25de900> <<< 11000 1726867140.45289: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc25de690> <<< 11000 1726867140.45357: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc25ddfa0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 11000 1726867140.45424: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc25de3f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2ae29c0> import 'atexit' # <<< 11000 1726867140.45462: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc25df680> <<< 11000 1726867140.45508: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc25df800> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11000 1726867140.45579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 11000 1726867140.45660: stdout chunk (state=3): >>>import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc25dfd10> <<< 11000 1726867140.45699: stdout chunk (state=3): >>>import 'pwd' # <<< 11000 1726867140.45702: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11000 1726867140.45723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11000 1726867140.45794: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f2dac0> <<< 11000 1726867140.45827: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f2f6e0> <<< 11000 1726867140.45830: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 11000 1726867140.45914: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f2ff50> <<< 11000 1726867140.45926: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11000 1726867140.45963: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11000 1726867140.45973: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f311c0> <<< 11000 1726867140.46007: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 11000 1726867140.46046: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 11000 1726867140.46151: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f33cb0> <<< 11000 1726867140.46199: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc27401d0> <<< 11000 1726867140.46230: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f31f70> <<< 11000 1726867140.46360: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11000 1726867140.46496: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 11000 1726867140.46510: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc 
matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 11000 1726867140.46532: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f3bc20> <<< 11000 1726867140.46748: stdout chunk (state=3): >>>import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f3a6f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f3a450> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11000 1726867140.46773: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f3a9c0> <<< 11000 1726867140.46803: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f32480> <<< 11000 1726867140.46844: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f7fe90> <<< 11000 1726867140.46891: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 11000 1726867140.46916: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f7fef0> <<< 11000 1726867140.46959: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 11000 1726867140.46964: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 11000 1726867140.46976: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 11000 1726867140.47022: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867140.47050: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f81ac0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f81880> <<< 11000 1726867140.47063: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11000 1726867140.47093: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 11000 1726867140.47165: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f83fb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f82120> <<< 11000 1726867140.47194: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11000 1726867140.47248: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867140.47296: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 11000 1726867140.47307: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 11000 1726867140.47364: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f87620> <<< 11000 1726867140.47563: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f840e0> <<< 11000 1726867140.47645: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f881a0> <<< 11000 1726867140.47848: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f88380> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f88860> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f801a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867140.47879: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f8bfb0> <<< 11000 1726867140.48132: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867140.48135: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 11000 
1726867140.48162: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1e15280> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f8a750> <<< 11000 1726867140.48207: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f8bb00> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f8a3c0> <<< 11000 1726867140.48210: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.48255: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 11000 1726867140.48259: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.48390: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.48518: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.48529: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 11000 1726867140.48567: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.48583: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 11000 1726867140.48613: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.48782: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.48958: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.49946: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.50763: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 11000 1726867140.50804: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 11000 1726867140.50820: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 11000 1726867140.50842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867140.50910: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1e19280> <<< 11000 1726867140.51259: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1e1a060> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f8b290> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 11000 1726867140.51387: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11000 1726867140.51635: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 11000 1726867140.51640: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 11000 1726867140.51665: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1e1a150> # zipimport: zlib available <<< 11000 1726867140.52397: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.53129: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.53236: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.53360: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11000 1726867140.53363: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.53411: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.53455: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 11000 1726867140.53481: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.53574: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.53727: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 11000 1726867140.53731: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.53746: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 11000 1726867140.53781: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.53808: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.53884: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 11000 1726867140.53888: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.54248: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.54620: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11000 1726867140.54710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11000 1726867140.54723: stdout chunk (state=3): >>>import '_ast' # <<< 11000 1726867140.54833: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1e1b320> <<< 11000 1726867140.54836: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.54943: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.55046: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 11000 1726867140.55261: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 11000 1726867140.55265: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.55320: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.55406: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.55496: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11000 
1726867140.55558: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867140.55674: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1e25ee0> <<< 11000 1726867140.55729: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1e232f0> <<< 11000 1726867140.55780: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 11000 1726867140.55794: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.55881: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.55981: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.56005: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.56065: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867140.56104: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 11000 1726867140.56121: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 11000 1726867140.56148: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 11000 1726867140.56227: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 11000 1726867140.56254: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 11000 1726867140.56275: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11000 1726867140.56365: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f0e840> <<< 11000 1726867140.56428: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc260a510> <<< 11000 1726867140.56652: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1e25f70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1e18590> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 11000 1726867140.56690: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 11000 1726867140.56725: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 11000 1726867140.56756: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.56835: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 11000 1726867140.56921: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.56949: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.56979: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.57033: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.57094: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.57142: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.57208: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 11000 1726867140.57218: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.57546: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 11000 1726867140.57776: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.58046: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.58105: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.58195: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867140.58217: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 11000 1726867140.58247: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 11000 1726867140.58251: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 11000 1726867140.58309: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 11000 1726867140.58312: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1eb5d60> <<< 11000 1726867140.58342: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 11000 1726867140.58353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 11000 1726867140.58386: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 11000 1726867140.58444: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 11000 1726867140.58484: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 11000 1726867140.58488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 11000 1726867140.58517: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1a97da0> <<< 11000 1726867140.58539: stdout chunk (state=3): >>># extension module '_pickle' loaded from 
'/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867140.58564: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1a9c140> <<< 11000 1726867140.58633: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1e9e6f0> <<< 11000 1726867140.58665: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1eb68d0> <<< 11000 1726867140.58737: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1eb4440> <<< 11000 1726867140.58753: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1eb5e50> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 11000 1726867140.58820: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 11000 1726867140.58824: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 11000 1726867140.58840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 11000 1726867140.58864: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 11000 1726867140.59042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1a9f1d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1a9ea80> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1a9ec30> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1a9deb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 11000 1726867140.59141: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 11000 1726867140.59149: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1a9f320> <<< 11000 1726867140.59180: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 11000 1726867140.59216: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 11000 1726867140.59260: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from 
'/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867140.59267: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1af9e50> <<< 11000 1726867140.59300: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1a9fe30> <<< 11000 1726867140.59336: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1eb54f0> <<< 11000 1726867140.59345: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # <<< 11000 1726867140.59373: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 11000 1726867140.59375: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.59443: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 11000 1726867140.59469: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.59549: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.59626: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 11000 1726867140.59652: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.59728: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.59794: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 11000 1726867140.59814: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.59834: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 11000 1726867140.59939: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 11000 1726867140.59998: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.60066: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 11000 1726867140.60079: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.60152: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.60195: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 11000 1726867140.60208: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.60283: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.60365: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.60646: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 11000 1726867140.61358: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.62283: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 11000 1726867140.62300: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available <<< 11000 1726867140.62324: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.system.env' # <<< 11000 1726867140.62336: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.62385: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.62511: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available <<< 11000 1726867140.62551: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 11000 1726867140.62566: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.62593: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 11000 1726867140.62646: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.62688: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.62758: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 11000 1726867140.62779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 11000 1726867140.62792: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1afa870> <<< 11000 1726867140.62816: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 11000 1726867140.62846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 11000 1726867140.62986: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1afaa80> import 'ansible.module_utils.facts.system.local' # <<< 11000 1726867140.62995: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.63047: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.63113: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 11000 1726867140.63133: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.63324: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.63543: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # <<< 11000 1726867140.63566: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.63672: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 11000 1726867140.63739: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 11000 1726867140.63829: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867140.63914: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1b3a1b0> <<< 11000 1726867140.64213: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1b29fd0> <<< 11000 1726867140.64222: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 11000 1726867140.64228: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 11000 1726867140.64310: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.64388: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 11000 1726867140.64394: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.64529: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.64653: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.64824: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.65041: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 11000 1726867140.65050: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.65144: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 11000 1726867140.65159: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.65210: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.65273: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 11000 1726867140.65283: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 11000 1726867140.65316: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867140.65335: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1b4dd00> <<< 11000 1726867140.65363: stdout chunk (state=3): >>>import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1b2b1d0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 11000 1726867140.65401: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 11000 1726867140.65503: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # <<< 11000 1726867140.65507: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.65753: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.65981: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 11000 1726867140.65996: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.66170: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.66251: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.66540: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.66543: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11000 1726867140.66545: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.66676: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 11000 1726867140.66692: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 
1726867140.66806: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.66939: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 11000 1726867140.66942: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.67028: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.67052: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.67553: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.68073: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 11000 1726867140.68115: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.68187: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.68461: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 11000 1726867140.68590: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 11000 1726867140.68602: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.68897: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.69060: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 11000 1726867140.69076: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.69098: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 11000 1726867140.69117: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.69223: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.base' # <<< 11000 1726867140.69238: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.69388: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.69547: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.69843: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.70157: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 11000 1726867140.70213: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11000 1726867140.70266: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 11000 1726867140.70281: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.70307: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.70333: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 11000 1726867140.70373: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.70449: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.70556: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 11000 1726867140.70615: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # <<< 11000 1726867140.70620: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.70706: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.70795: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 
11000 1726867140.70945: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.70953: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 11000 1726867140.70966: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.71275: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.71536: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 11000 1726867140.71559: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.71610: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.71672: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 11000 1726867140.71675: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.71898: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 11000 1726867140.71924: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 11000 1726867140.72045: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.72050: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.72164: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 11000 1726867140.72186: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.72200: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.72210: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 11000 1726867140.72228: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.72282: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.72340: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 11000 1726867140.72353: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.72376: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.72403: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.72465: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.72536: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.72638: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.72732: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 11000 1726867140.72763: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 11000 1726867140.72829: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.72908: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 11000 1726867140.73265: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.73514: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 11000 1726867140.73553: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.73586: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.73649: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 11000 1726867140.73724: stdout chunk 
(state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11000 1726867140.73791: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 11000 1726867140.73797: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.73915: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.74035: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 11000 1726867140.74045: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 11000 1726867140.74057: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.74184: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.74342: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 11000 1726867140.74348: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 11000 1726867140.74422: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867140.75514: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 11000 1726867140.75517: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 11000 1726867140.75537: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 11000 1726867140.75584: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 11000 1726867140.75591: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867140.75614: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc194b3e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1948b60> <<< 11000 1726867140.75662: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc194ad20> <<< 11000 1726867140.76099: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": 
"38", "day": "20", "hour": "17", "minute": "19", "second": "00", "epoch": "1726867140", "epoch_int": "1726867140", "date": "2024-09-20", "time": "17:19:00", "iso8601_micro": "2024-09-20T21:19:00.753136Z", "iso8601": "2024-09-20T21:19:00Z", "iso8601_basic": "20240920T171900753136", "iso8601_basic_short": "20240920T171900", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Za7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11000 1726867140.76662: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 11000 1726867140.76693: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs <<< 11000 1726867140.76723: stdout chunk (state=3): >>># cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword <<< 11000 1726867140.76735: stdout chunk (state=3): >>># cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery <<< 11000 1726867140.76817: stdout chunk (state=3): >>># cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # 
cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ <<< 11000 1726867140.76824: stdout chunk (state=3): >>># cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale <<< 11000 1726867140.76826: stdout chunk (state=3): >>># cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid <<< 11000 1726867140.76854: stdout chunk (state=3): >>># cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing 
ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale <<< 11000 1726867140.76871: stdout chunk (state=3): >>># destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ <<< 11000 1726867140.76906: stdout chunk (state=3): >>># destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing 
ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local <<< 11000 1726867140.76961: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] 
removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter <<< 11000 1726867140.76965: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware <<< 11000 1726867140.76980: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy 
ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 11000 1726867140.77316: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11000 1726867140.77320: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 11000 1726867140.77349: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 11000 1726867140.77376: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 11000 1726867140.77395: stdout chunk (state=3): >>># destroy ntpath <<< 11000 1726867140.77435: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 11000 1726867140.77480: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select <<< 11000 1726867140.77483: stdout chunk (state=3): >>># destroy _signal # destroy _posixsubprocess <<< 11000 1726867140.77505: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 11000 1726867140.77551: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 11000 1726867140.77554: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 11000 1726867140.77605: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 11000 1726867140.77648: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 11000 1726867140.77689: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl <<< 11000 1726867140.77717: stdout chunk (state=3): >>># destroy datetime # destroy subprocess # destroy base64 <<< 11000 1726867140.77731: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 11000 1726867140.77774: stdout chunk (state=3): >>># destroy errno # destroy json # destroy socket # destroy struct <<< 11000 1726867140.77790: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 11000 1726867140.77876: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # 
cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 11000 1726867140.77884: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 11000 1726867140.77923: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 11000 1726867140.77979: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 <<< 11000 1726867140.78006: stdout chunk (state=3): >>># cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 11000 1726867140.78016: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11000 1726867140.78157: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 11000 1726867140.78216: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 11000 1726867140.78231: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 11000 1726867140.78274: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy 
ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 11000 1726867140.78315: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11000 1726867140.78411: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 11000 1726867140.78491: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib <<< 11000 1726867140.78498: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re <<< 11000 1726867140.78525: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11000 1726867140.78955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867140.78958: stdout chunk (state=3): >>><<< 11000 1726867140.78961: stderr chunk (state=3): >>><<< 11000 1726867140.79302: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2ae04d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2aafb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2ae2a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2891130> # 
/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2891fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc28cfe60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc28cff20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2907890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2907f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc28e7b30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc28e5250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc28cd010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2927800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2926450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc28e6120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2924cb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc295c860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc28cc290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc295cd10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc295cbc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc295cfb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc28cadb0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc295d6a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc295d370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc295e5a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc29747a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from 
'/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc2975e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2976d20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc2977320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2976270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc2977da0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc29774d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc295e510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc2667bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc26906b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2690410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc26906e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from 
'/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc2691010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc26919d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc26908c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2665d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2692d20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2690e60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc295e750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc26bf080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc26df440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2740260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc27429c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2740380> import 'pathlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4dc270d250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2545340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc26de240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2693c50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f4dc26de840> # zipimport: found 103 names in '/tmp/ansible_setup_payload_maw2_5wu/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc25aef90> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc258de80> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc258d070> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc25ace30> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc25de900> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc25de690> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc25ddfa0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 
'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc25de3f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc2ae29c0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc25df680> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc25df800> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc25dfd10> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f2dac0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f2f6e0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f2ff50> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f311c0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f33cb0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc27401d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f31f70> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f3bc20> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f3a6f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f3a450> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f3a9c0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f32480> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f7fe90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f7fef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f81ac0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f81880> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f83fb0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f82120> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f87620> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f840e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f881a0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f88380> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f88860> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f801a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f8bfb0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1e15280> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f8a750> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1f8bb00> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f8a3c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 
'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1e19280> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1e1a060> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f8b290> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1e1a150> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1e1b320> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1e25ee0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f4dc1e232f0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1f0e840> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc260a510> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1e25f70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1e18590> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1eb5d60> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1a97da0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1a9c140> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1e9e6f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1eb68d0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1eb4440> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1eb5e50> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1a9f1d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1a9ea80> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1a9ec30> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1a9deb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1a9f320> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1af9e50> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1a9fe30> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f4dc1eb54f0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1afa870> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1afaa80> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1b3a1b0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1b29fd0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc1b4dd00> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1b2b1d0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4dc194b3e0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc1948b60> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4dc194ad20> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": 
"1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "19", "second": "00", "epoch": "1726867140", "epoch_int": "1726867140", "date": "2024-09-20", "time": "17:19:00", "iso8601_micro": "2024-09-20T21:19:00.753136Z", "iso8601": "2024-09-20T21:19:00Z", "iso8601_basic": "20240920T171900753136", "iso8601_basic_short": "20240920T171900", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_fips": false, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Za7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing 
base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc 
# cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # 
cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual 
# cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy 
ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # 
cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address 
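The JSON object embedded in the stdout above is the return payload of the setup module for this run; its "invocation" block shows it was called with gather_subset ["min"], gather_timeout 10 and fact_path /etc/ansible/facts.d. A minimal sketch of an equivalent explicit task follows; the task names and the min_facts register variable are illustrative and are not taken from the test playbook:

  - name: Gather only the "min" fact subset (sketch mirroring the module_args above)
    ansible.builtin.setup:
      gather_subset:
        - min
      gather_timeout: 10
      fact_path: /etc/ansible/facts.d
    register: min_facts   # illustrative variable name, not from this run

  - name: Print a few of the facts reported in the payload above (example only)
    ansible.builtin.debug:
      msg: >-
        {{ min_facts.ansible_facts.ansible_distribution }}
        {{ min_facts.ansible_facts.ansible_distribution_version }}
        (service manager: {{ min_facts.ansible_facts.ansible_service_mgr }})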
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] 
removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy 
ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing 
ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # 
destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 11000 1726867140.80294: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867140.137419-11088-194151728025335/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867140.80297: _low_level_execute_command(): starting 11000 1726867140.80300: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867140.137419-11088-194151728025335/ > /dev/null 2>&1 && sleep 0' 11000 1726867140.80613: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867140.80693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867140.80701: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867140.80714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867140.80792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867140.82640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867140.82646: stderr chunk (state=3): >>><<< 11000 1726867140.82664: stdout chunk (state=3): >>><<< 11000 1726867140.82701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867140.82714: handler run complete 11000 1726867140.82869: variable 'ansible_facts' from source: unknown 11000 1726867140.82873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867140.82966: variable 'ansible_facts' from source: unknown 11000 1726867140.83034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867140.83110: attempt loop complete, returning result 11000 1726867140.83118: _execute() done 11000 1726867140.83124: dumping result to json 11000 1726867140.83139: done dumping result, returning 11000 1726867140.83152: done running TaskExecutor() for managed_node1/TASK: Gather 
the minimum subset of ansible_facts required by the network role test [0affcac9-a3a5-c734-026a-0000000000de] 11000 1726867140.83162: sending task result for task 0affcac9-a3a5-c734-026a-0000000000de 11000 1726867140.83700: done sending task result for task 0affcac9-a3a5-c734-026a-0000000000de 11000 1726867140.83703: WORKER PROCESS EXITING ok: [managed_node1] 11000 1726867140.83815: no more pending results, returning what we have 11000 1726867140.83818: results queue empty 11000 1726867140.83819: checking for any_errors_fatal 11000 1726867140.83821: done checking for any_errors_fatal 11000 1726867140.83821: checking for max_fail_percentage 11000 1726867140.83823: done checking for max_fail_percentage 11000 1726867140.83824: checking to see if all hosts have failed and the running result is not ok 11000 1726867140.83824: done checking to see if all hosts have failed 11000 1726867140.83825: getting the remaining hosts for this loop 11000 1726867140.83826: done getting the remaining hosts for this loop 11000 1726867140.83830: getting the next task for host managed_node1 11000 1726867140.83838: done getting next task for host managed_node1 11000 1726867140.83840: ^ task is: TASK: Check if system is ostree 11000 1726867140.83843: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867140.83846: getting variables 11000 1726867140.83848: in VariableManager get_vars() 11000 1726867140.83874: Calling all_inventory to load vars for managed_node1 11000 1726867140.83884: Calling groups_inventory to load vars for managed_node1 11000 1726867140.83895: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867140.83907: Calling all_plugins_play to load vars for managed_node1 11000 1726867140.83910: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867140.83913: Calling groups_plugins_play to load vars for managed_node1 11000 1726867140.84164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867140.84389: done with get_vars() 11000 1726867140.84401: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 17:19:00 -0400 (0:00:00.852) 0:00:02.488 ****** 11000 1726867140.84505: entering _queue_task() for managed_node1/stat 11000 1726867140.84835: worker is 1 (out of 1 available) 11000 1726867140.84847: exiting _queue_task() for managed_node1/stat 11000 1726867140.84868: done queuing things up, now waiting for results queue to drain 11000 1726867140.84870: waiting for pending results... 
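(The run above has just finished the minimal fact-gathering step — the setup module was invoked with gather_subset: min — and is now queuing the "Check if system is ostree" stat task. A roughly equivalent ad-hoc invocation, useful for reproducing this level of tracing outside the test playbook, is sketched below; the inventory path is a placeholder, and the PID-prefixed _low_level_execute_command lines only appear because ANSIBLE_DEBUG is enabled for this run, as the '_ansible_debug': True module argument shows.)

    # Hedged sketch: re-run the same minimal fact gather ad hoc with full tracing.
    # <inventory.yml> is a placeholder for whatever inventory defines managed_node1.
    ANSIBLE_DEBUG=1 ansible managed_node1 -i <inventory.yml> -m setup -a 'gather_subset=min' -vvv
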
11000 1726867140.85134: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 11000 1726867140.85255: in run() - task 0affcac9-a3a5-c734-026a-0000000000e0 11000 1726867140.85275: variable 'ansible_search_path' from source: unknown 11000 1726867140.85284: variable 'ansible_search_path' from source: unknown 11000 1726867140.85336: calling self._execute() 11000 1726867140.85427: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867140.85439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867140.85453: variable 'omit' from source: magic vars 11000 1726867140.85998: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867140.86294: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867140.86303: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867140.86339: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867140.86410: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867140.86495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867140.86582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867140.86588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867140.86590: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867140.86730: Evaluated conditional (not __network_is_ostree is defined): True 11000 1726867140.86741: variable 'omit' from source: magic vars 11000 1726867140.86782: variable 'omit' from source: magic vars 11000 1726867140.86833: variable 'omit' from source: magic vars 11000 1726867140.86865: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867140.86901: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867140.86924: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867140.86956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867140.86971: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867140.87008: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867140.87016: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867140.87024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867140.87164: Set connection var ansible_shell_type to sh 11000 1726867140.87168: Set connection var ansible_pipelining to False 11000 1726867140.87170: Set connection 
var ansible_shell_executable to /bin/sh 11000 1726867140.87172: Set connection var ansible_connection to ssh 11000 1726867140.87174: Set connection var ansible_timeout to 10 11000 1726867140.87191: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867140.87222: variable 'ansible_shell_executable' from source: unknown 11000 1726867140.87273: variable 'ansible_connection' from source: unknown 11000 1726867140.87276: variable 'ansible_module_compression' from source: unknown 11000 1726867140.87280: variable 'ansible_shell_type' from source: unknown 11000 1726867140.87282: variable 'ansible_shell_executable' from source: unknown 11000 1726867140.87284: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867140.87287: variable 'ansible_pipelining' from source: unknown 11000 1726867140.87289: variable 'ansible_timeout' from source: unknown 11000 1726867140.87291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867140.87417: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11000 1726867140.87432: variable 'omit' from source: magic vars 11000 1726867140.87492: starting attempt loop 11000 1726867140.87496: running the handler 11000 1726867140.87498: _low_level_execute_command(): starting 11000 1726867140.87500: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867140.88311: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867140.88369: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867140.88433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867140.88445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867140.88474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867140.88560: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867140.90437: stdout chunk (state=3): >>>/root <<< 11000 1726867140.90583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867140.90616: stderr chunk (state=3): >>><<< 11000 1726867140.90618: stdout chunk (state=3): >>><<< 11000 1726867140.90636: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867140.90680: _low_level_execute_command(): starting 11000 1726867140.90685: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867140.9064562-11122-132724654650277 `" && echo ansible-tmp-1726867140.9064562-11122-132724654650277="` echo /root/.ansible/tmp/ansible-tmp-1726867140.9064562-11122-132724654650277 `" ) && sleep 0' 11000 1726867140.91075: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867140.91080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867140.91082: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867140.91087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867140.91137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867140.91143: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867140.91194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867140.93507: stdout chunk (state=3): >>>ansible-tmp-1726867140.9064562-11122-132724654650277=/root/.ansible/tmp/ansible-tmp-1726867140.9064562-11122-132724654650277 <<< 11000 1726867140.93641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867140.93667: stderr chunk (state=3): >>><<< 11000 1726867140.93669: stdout chunk (state=3): >>><<< 11000 1726867140.93684: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867140.9064562-11122-132724654650277=/root/.ansible/tmp/ansible-tmp-1726867140.9064562-11122-132724654650277 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867140.93730: variable 'ansible_module_compression' from source: unknown 11000 1726867140.93769: ANSIBALLZ: Using lock for stat 11000 1726867140.93772: ANSIBALLZ: Acquiring lock 11000 1726867140.93775: ANSIBALLZ: Lock acquired: 139984830863440 11000 1726867140.93779: ANSIBALLZ: Creating module 11000 1726867141.03787: ANSIBALLZ: Writing module into payload 11000 1726867141.03792: ANSIBALLZ: Writing module 11000 1726867141.03795: ANSIBALLZ: Renaming module 11000 1726867141.03798: ANSIBALLZ: Done creating module 11000 1726867141.03800: variable 'ansible_facts' from source: unknown 11000 1726867141.03821: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867140.9064562-11122-132724654650277/AnsiballZ_stat.py 11000 1726867141.04000: Sending initial data 11000 1726867141.04008: Sent initial data (153 bytes) 11000 1726867141.04562: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867141.04581: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867141.04602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867141.04625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867141.04637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867141.04730: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867141.04752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867141.04770: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867141.04868: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11000 1726867141.07179: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867141.07253: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11000 1726867141.07313: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpwz7d5cly /root/.ansible/tmp/ansible-tmp-1726867140.9064562-11122-132724654650277/AnsiballZ_stat.py <<< 11000 1726867141.07332: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867140.9064562-11122-132724654650277/AnsiballZ_stat.py" <<< 11000 1726867141.07392: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpwz7d5cly" to remote "/root/.ansible/tmp/ansible-tmp-1726867140.9064562-11122-132724654650277/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867140.9064562-11122-132724654650277/AnsiballZ_stat.py" <<< 11000 1726867141.08240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867141.08328: stderr chunk (state=3): >>><<< 11000 1726867141.08348: stdout chunk (state=3): >>><<< 11000 1726867141.08434: done transferring module to remote 11000 1726867141.08452: _low_level_execute_command(): starting 11000 1726867141.08462: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867140.9064562-11122-132724654650277/ /root/.ansible/tmp/ansible-tmp-1726867140.9064562-11122-132724654650277/AnsiballZ_stat.py && sleep 0' 11000 1726867141.09237: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867141.09279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11000 1726867141.09322: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867141.09383: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867141.09399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867141.09546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867141.09563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867141.09682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11000 1726867141.12329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867141.12340: stdout chunk (state=3): >>><<< 11000 1726867141.12367: stderr chunk (state=3): >>><<< 11000 1726867141.12475: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11000 1726867141.12482: _low_level_execute_command(): starting 11000 1726867141.12487: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867140.9064562-11122-132724654650277/AnsiballZ_stat.py && sleep 0' 11000 1726867141.13029: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867141.13051: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867141.13142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11000 1726867141.16561: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # 
builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # <<< 11000 1726867141.16600: stdout chunk (state=3): >>>import 'posix' # <<< 11000 1726867141.16640: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook import 'time' # <<< 11000 1726867141.16663: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 11000 1726867141.16717: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867141.16748: stdout chunk (state=3): >>>import '_codecs' # <<< 11000 1726867141.16774: stdout chunk (state=3): >>>import 'codecs' # <<< 11000 1726867141.16821: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11000 1726867141.16857: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 11000 1726867141.16876: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fbbb84d0> <<< 11000 1726867141.16942: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fbb87b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fbbbaa50> import '_signal' # <<< 11000 1726867141.16975: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 11000 1726867141.17003: stdout chunk (state=3): >>>import 'io' # <<< 11000 1726867141.17045: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 11000 1726867141.17190: stdout chunk (state=3): >>>import '_collections_abc' # <<< 11000 1726867141.17255: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 11000 1726867141.17289: stdout chunk (state=3): >>>import 'os' # <<< 11000 1726867141.17301: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 11000 1726867141.17358: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 11000 1726867141.17369: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 11000 1726867141.17405: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb969130> <<< 11000 1726867141.17472: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 11000 1726867141.17495: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867141.17506: stdout chunk (state=3): >>>import '_distutils_hack' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb969fa0> <<< 11000 1726867141.17555: stdout chunk (state=3): >>>import 'site' # <<< 11000 1726867141.17564: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 11000 1726867141.18151: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9a7e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 11000 1726867141.18194: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9a7f20> <<< 11000 1726867141.18209: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11000 1726867141.18247: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 11000 1726867141.18284: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 11000 1726867141.18351: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867141.18373: stdout chunk (state=3): >>>import 'itertools' # <<< 11000 1726867141.18412: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9df890> <<< 11000 1726867141.18456: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 11000 1726867141.18460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 11000 1726867141.18483: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9dff20> import '_collections' # <<< 11000 1726867141.18555: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9bfb30> <<< 11000 1726867141.18578: stdout chunk (state=3): >>>import '_functools' # <<< 11000 1726867141.18604: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9bd250> <<< 11000 1726867141.18773: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fe1fb9a5010> <<< 11000 1726867141.18803: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 11000 1726867141.18847: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 11000 1726867141.18905: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 11000 1726867141.18970: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9ff800> <<< 11000 1726867141.18978: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9fe450> <<< 11000 1726867141.19016: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9be120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9fccb0> <<< 11000 1726867141.19257: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba34860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9a4290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fba34d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba34bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fba34fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9a2db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 11000 1726867141.19313: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 11000 1726867141.19318: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba356a0> import 
'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba35370> import 'importlib.machinery' # <<< 11000 1726867141.19373: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 11000 1726867141.19407: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba365a0> <<< 11000 1726867141.19424: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 11000 1726867141.19460: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11000 1726867141.19499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 11000 1726867141.19545: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 11000 1726867141.19548: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba4c7a0> <<< 11000 1726867141.19596: stdout chunk (state=3): >>>import 'errno' # <<< 11000 1726867141.19620: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fba4de80> <<< 11000 1726867141.19647: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11000 1726867141.19701: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 11000 1726867141.19713: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba4ed20> <<< 11000 1726867141.19845: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fba4f320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba4e270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fba4fda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba4f4d0> <<< 11000 1726867141.19918: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba36510> <<< 11000 
1726867141.19933: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 11000 1726867141.19957: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11000 1726867141.19982: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 11000 1726867141.20013: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11000 1726867141.20049: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb7cfbf0> <<< 11000 1726867141.20079: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11000 1726867141.20117: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb7f86b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb7f8410> <<< 11000 1726867141.20360: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb7f86e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867141.20548: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb7f9010> <<< 11000 1726867141.20604: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb7f99d0> <<< 11000 1726867141.20630: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb7f88c0> <<< 11000 1726867141.20640: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb7cdd90> <<< 11000 1726867141.20675: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11000 1726867141.20699: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 11000 1726867141.20728: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 11000 1726867141.20753: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 11000 1726867141.20763: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb7fad20> <<< 11000 1726867141.20804: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb7f8e60> <<< 11000 1726867141.20820: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba36750> <<< 11000 1726867141.20853: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11000 1726867141.21158: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb827080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 11000 1726867141.21189: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 11000 1726867141.21258: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb847440> <<< 11000 1726867141.21272: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11000 1726867141.21337: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11000 1726867141.21407: stdout chunk (state=3): >>>import 'ntpath' # <<< 11000 1726867141.21447: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb8a8260> <<< 11000 1726867141.21484: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11000 1726867141.21514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11000 1726867141.21547: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11000 1726867141.21604: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11000 1726867141.21730: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb8aa9c0> <<< 11000 1726867141.21843: stdout chunk (state=3): >>>import 'urllib.parse' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb8a8380> <<< 11000 1726867141.21921: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb875250> <<< 11000 1726867141.21933: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb129370> <<< 11000 1726867141.21965: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb846240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb7fbc50> <<< 11000 1726867141.22142: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11000 1726867141.22170: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fe1fb846840> <<< 11000 1726867141.22370: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_v1k4kxj9/ansible_stat_payload.zip' # zipimport: zlib available <<< 11000 1726867141.22580: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.22660: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11000 1726867141.22679: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11000 1726867141.22770: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 11000 1726867141.22815: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb17efc0> <<< 11000 1726867141.22846: stdout chunk (state=3): >>>import '_typing' # <<< 11000 1726867141.23110: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb15deb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb15d070> <<< 11000 1726867141.23158: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.23168: stdout chunk (state=3): >>>import 'ansible' # <<< 11000 1726867141.23347: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 11000 1726867141.25458: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.27359: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb17ce90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867141.27425: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 11000 1726867141.27445: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 11000 1726867141.27475: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb1aa930> <<< 11000 1726867141.27535: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb1aa6c0> <<< 11000 1726867141.27570: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb1a9fd0> <<< 11000 1726867141.27605: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 11000 1726867141.27609: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 11000 1726867141.27661: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb1aa420> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb17fc50> <<< 11000 1726867141.27681: stdout chunk (state=3): >>>import 'atexit' # <<< 11000 1726867141.27706: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867141.27846: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867141.27855: stdout chunk (state=3): >>>import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb1ab6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb1ab8f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 11000 1726867141.27870: stdout chunk (state=3): >>>import '_locale' # <<< 11000 1726867141.27937: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb1abe00> <<< 11000 1726867141.27939: stdout chunk (state=3): >>>import 'pwd' # <<< 11000 1726867141.27971: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11000 1726867141.28000: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11000 1726867141.28054: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb011a60> <<< 11000 1726867141.28089: stdout chunk (state=3): >>># extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867141.28097: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb013710> <<< 11000 1726867141.28116: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 11000 1726867141.28140: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11000 1726867141.28187: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb0140e0> <<< 11000 1726867141.28211: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11000 1726867141.28246: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11000 1726867141.28344: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb014fe0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 11000 1726867141.28357: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 11000 1726867141.28366: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11000 1726867141.28437: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb017d40> <<< 11000 1726867141.28481: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb7fac90> <<< 11000 1726867141.28508: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb016000> <<< 11000 1726867141.28534: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 11000 1726867141.28569: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 11000 1726867141.28742: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb01fb90> import '_tokenize' # <<< 11000 1726867141.28774: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb01e660> 
import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb01e3f0> <<< 11000 1726867141.28805: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 11000 1726867141.28808: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11000 1726867141.28923: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb01e930> <<< 11000 1726867141.28960: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb016480> <<< 11000 1726867141.28999: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867141.29004: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb067da0> <<< 11000 1726867141.29035: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 11000 1726867141.29042: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb067fb0> <<< 11000 1726867141.29073: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 11000 1726867141.29097: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 11000 1726867141.29124: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 11000 1726867141.29178: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb0699a0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb069760> <<< 11000 1726867141.29232: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11000 1726867141.29413: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 11000 1726867141.29656: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb06bf20> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb06a090> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from 
'/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb06f680> <<< 11000 1726867141.29835: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb06bf50> <<< 11000 1726867141.29921: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb070440> <<< 11000 1726867141.29949: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867141.29954: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb0706e0> <<< 11000 1726867141.30006: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb0709e0> <<< 11000 1726867141.30025: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb068050> <<< 11000 1726867141.30058: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 11000 1726867141.30091: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 11000 1726867141.30121: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 11000 1726867141.30160: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867141.30200: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb0fc1a0> <<< 11000 1726867141.30445: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867141.30461: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb0fd610> <<< 11000 1726867141.30471: stdout chunk (state=3): >>>import 'socket' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb072930> <<< 11000 1726867141.30507: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 11000 1726867141.30511: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb073ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb072540> <<< 11000 1726867141.30532: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.30556: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 11000 1726867141.30584: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.30738: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.30833: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.30859: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.30862: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 11000 1726867141.30871: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.30940: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 11000 1726867141.31082: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.31255: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.32168: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.33034: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 11000 1726867141.33176: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1faf01820> <<< 11000 1726867141.33289: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 11000 1726867141.33301: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1faf02630> <<< 11000 1726867141.33328: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb0fd8e0> <<< 11000 1726867141.33397: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 11000 1726867141.33400: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.33444: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 11000 1726867141.33447: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 
1726867141.33689: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.33921: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 11000 1726867141.33924: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 11000 1726867141.33960: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1faf023f0> # zipimport: zlib available <<< 11000 1726867141.34747: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.35501: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11000 1726867141.35606: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 11000 1726867141.35746: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 11000 1726867141.35817: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.36002: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 11000 1726867141.36051: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.36103: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 11000 1726867141.36124: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.36480: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.36837: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11000 1726867141.36927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11000 1726867141.36941: stdout chunk (state=3): >>>import '_ast' # <<< 11000 1726867141.37206: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1faf03800> # zipimport: zlib available <<< 11000 1726867141.37255: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 11000 1726867141.37286: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.37336: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.37384: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 11000 1726867141.37407: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.37511: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.37525: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.37595: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.37702: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11000 1726867141.37795: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11000 1726867141.37892: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1faf0e0f0> <<< 11000 1726867141.38051: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1faf0b530> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 11000 1726867141.38075: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.38186: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.38203: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.38299: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 11000 1726867141.38317: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 11000 1726867141.38341: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 11000 1726867141.38420: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 11000 1726867141.38454: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 11000 1726867141.38472: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11000 1726867141.38655: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb1fa9f0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb1ee6c0> <<< 11000 1726867141.38950: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1faf0deb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1faf02300> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 11000 1726867141.39151: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.39437: stdout chunk (state=3): >>># zipimport: zlib available <<< 11000 1726867141.39568: stdout chunk (state=3): >>> <<< 11000 1726867141.39659: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 11000 1726867141.39688: stdout chunk (state=3): >>># destroy __main__ <<< 11000 1726867141.40189: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear 
builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys <<< 11000 1726867141.40207: stdout chunk (state=3): >>># cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io<<< 11000 1726867141.40258: stdout chunk (state=3): >>> # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins <<< 11000 1726867141.40310: stdout chunk (state=3): >>># cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib<<< 11000 1726867141.40350: stdout chunk (state=3): >>> # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # 
cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 <<< 11000 1726867141.40372: stdout chunk (state=3): >>># cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ <<< 11000 1726867141.40650: stdout chunk (state=3): >>># cleanup[2] removing _json <<< 11000 1726867141.40687: stdout chunk (state=3): >>># cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # 
cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 11000 1726867141.40781: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 11000 1726867141.40788: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 11000 1726867141.40843: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 11000 1726867141.40854: stdout chunk (state=3): >>># destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib<<< 11000 1726867141.40891: stdout chunk (state=3): >>> # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath <<< 11000 1726867141.40922: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder<<< 11000 1726867141.40971: stdout chunk (state=3): >>> # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd <<< 11000 1726867141.40995: stdout chunk (state=3): >>># destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess <<< 11000 1726867141.41175: stdout chunk (state=3): >>># destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping 
traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 11000 1726867141.41211: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 11000 1726867141.41261: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 11000 1726867141.41280: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 11000 1726867141.41327: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11000 1726867141.41511: stdout chunk (state=3): >>># destroy sys.monitoring <<< 11000 1726867141.41534: stdout chunk (state=3): >>># destroy _socket <<< 11000 1726867141.41775: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11000 1726867141.41803: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy 
_bisect # destroy time <<< 11000 1726867141.41836: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 11000 1726867141.41884: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator <<< 11000 1726867141.41887: stdout chunk (state=3): >>># destroy _string # destroy re <<< 11000 1726867141.41908: stdout chunk (state=3): >>># destroy itertools <<< 11000 1726867141.41934: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11000 1726867141.42398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867141.42436: stderr chunk (state=3): >>><<< 11000 1726867141.42463: stdout chunk (state=3): >>><<< 11000 1726867141.42593: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fbbb84d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fbb87b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fbbbaa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb969130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb969fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
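Buried in the chunk stream above is the one line the controller actually needs: the stat module's JSON reply for /run/ostree-booted ({"changed": false, "stat": {"exists": false}, ...}, in the chunk at 1726867141.39659), surrounded by import-trace and cleanup noise; the marker file is absent on the target, a check typically used to detect an rpm-ostree/image-based host. A rough sketch of how a caller could fish such a result out of captured stdout and branch on it (illustrative only, not Ansible's actual result parsing; the sample string is copied from the log above):

    # Illustrative sketch, not Ansible's real parsing code.
    # The embedded JSON is the module reply captured in the log above.
    import json

    stdout = """
    # zipimport: zlib available
    {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}}
    # destroy __main__
    """

    result = None
    for line in stdout.splitlines():
        line = line.strip()
        if line.startswith("{"):
            try:
                result = json.loads(line)
                break
            except json.JSONDecodeError:
                continue

    # Here stat.exists is False, i.e. /run/ostree-booted does not exist on the target.
    if result is not None:
        print("ostree-booted marker present:", result["stat"]["exists"])
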
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9a7e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9a7f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9df890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9dff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9bfb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9bd250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9a5010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9ff800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9fe450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9be120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9fccb0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba34860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9a4290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fba34d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba34bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fba34fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb9a2db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba356a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba35370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba365a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba4c7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fba4de80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fe1fba4ed20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fba4f320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba4e270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fba4fda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba4f4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba36510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb7cfbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb7f86b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb7f8410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb7f86e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb7f9010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb7f99d0> import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb7f88c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb7cdd90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb7fad20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb7f8e60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fba36750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb827080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb847440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb8a8260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb8aa9c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb8a8380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb875250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb129370> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb846240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb7fbc50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # 
<_frozen_importlib_external.SourcelessFileLoader object at 0x7fe1fb846840> # zipimport: found 30 names in '/tmp/ansible_stat_payload_v1k4kxj9/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb17efc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb15deb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb15d070> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb17ce90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb1aa930> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb1aa6c0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb1a9fd0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb1aa420> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb17fc50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb1ab6b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb1ab8f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb1abe00> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb011a60> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb013710> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb0140e0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb014fe0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb017d40> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb7fac90> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb016000> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb01fb90> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb01e660> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb01e3f0> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb01e930> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb016480> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb067da0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb067fb0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb0699a0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb069760> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb06bf20> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb06a090> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb06f680> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb06bf50> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb070440> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb0706e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb0709e0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb068050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb0fc1a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb0fd610> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb072930> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1fb073ce0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb072540> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1faf01820> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1faf02630> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb0fd8e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1faf023f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1faf03800> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fe1faf0e0f0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1faf0b530> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb1fa9f0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1fb1ee6c0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1faf0deb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fe1faf02300> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy 
ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # 
destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # 
destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
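Everything above, from the import lines through the interpreter teardown, is the remote Python's verbose trace from executing the stat module against /run/ostree-booted; the only part Ansible actually consumes is the embedded JSON result ({"changed": false, "stat": {"exists": false}}), and the surrounding chatter on the module's stdout is what produces the "junk after the JSON data" warning that follows. For orientation, here is a minimal sketch of the kind of task that drives this module run (reported further down as TASK: Check if system is ostree); the real task file is not reproduced in this log, and the fully qualified module name is an assumption:

  # Sketch only -- reconstructed from the module arguments shown in the trace
  # and the __ostree_booted_stat variable referenced later on.
  - name: Check if system is ostree
    ansible.builtin.stat:
      path: /run/ostree-booted
    register: __ostree_booted_stat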
[WARNING]: Module invocation had junk after the JSON data: (same Python interpreter teardown trace as shown above)
11000 1726867141.43290: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867140.9064562-11122-132724654650277/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867141.43293: _low_level_execute_command(): starting 11000 1726867141.43296: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1726867140.9064562-11122-132724654650277/ > /dev/null 2>&1 && sleep 0' 11000 1726867141.43780: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867141.43798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867141.43893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11000 1726867141.46448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867141.46519: stderr chunk (state=3): >>><<< 11000 1726867141.46528: stdout chunk (state=3): >>><<< 11000 1726867141.46550: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11000 1726867141.46561: handler run complete 11000 1726867141.46591: attempt loop complete, returning result 11000 1726867141.46611: _execute() done 11000 1726867141.46708: dumping result to json 11000 1726867141.46712: done dumping result, returning 11000 1726867141.46714: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [0affcac9-a3a5-c734-026a-0000000000e0] 11000 1726867141.46716: sending task result for task 0affcac9-a3a5-c734-026a-0000000000e0 11000 1726867141.46782: done sending task result for task 0affcac9-a3a5-c734-026a-0000000000e0 11000 1726867141.46784: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 11000 
1726867141.46849: no more pending results, returning what we have 11000 1726867141.46852: results queue empty 11000 1726867141.46853: checking for any_errors_fatal 11000 1726867141.46859: done checking for any_errors_fatal 11000 1726867141.46860: checking for max_fail_percentage 11000 1726867141.46862: done checking for max_fail_percentage 11000 1726867141.46862: checking to see if all hosts have failed and the running result is not ok 11000 1726867141.46863: done checking to see if all hosts have failed 11000 1726867141.46864: getting the remaining hosts for this loop 11000 1726867141.46865: done getting the remaining hosts for this loop 11000 1726867141.46868: getting the next task for host managed_node1 11000 1726867141.46874: done getting next task for host managed_node1 11000 1726867141.46990: ^ task is: TASK: Set flag to indicate system is ostree 11000 1726867141.46994: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867141.46999: getting variables 11000 1726867141.47000: in VariableManager get_vars() 11000 1726867141.47031: Calling all_inventory to load vars for managed_node1 11000 1726867141.47034: Calling groups_inventory to load vars for managed_node1 11000 1726867141.47037: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867141.47047: Calling all_plugins_play to load vars for managed_node1 11000 1726867141.47050: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867141.47053: Calling groups_plugins_play to load vars for managed_node1 11000 1726867141.47450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867141.47664: done with get_vars() 11000 1726867141.47674: done getting variables 11000 1726867141.47782: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 17:19:01 -0400 (0:00:00.633) 0:00:03.121 ****** 11000 1726867141.47810: entering _queue_task() for managed_node1/set_fact 11000 1726867141.47812: Creating lock for set_fact 11000 1726867141.48108: worker is 1 (out of 1 available) 11000 1726867141.48118: exiting _queue_task() for managed_node1/set_fact 11000 1726867141.48130: done queuing things up, now waiting for results queue to drain 11000 1726867141.48131: waiting for pending results... 
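The next stretch of the trace is the set_fact task at el_repo_setup.yml:22. From the conditional it evaluates below (not __network_is_ostree is defined), the registered __ostree_booted_stat result it reads, and the fact it reports (__network_is_ostree: false), the task plausibly has roughly this shape; the exact Jinja2 expression is an assumption:

  - name: Set flag to indicate system is ostree
    ansible.builtin.set_fact:
      # Assumed expression: turn the registered stat result into a boolean fact
      __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
    when: not __network_is_ostree is defined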
11000 1726867141.48397: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree 11000 1726867141.48587: in run() - task 0affcac9-a3a5-c734-026a-0000000000e1 11000 1726867141.48591: variable 'ansible_search_path' from source: unknown 11000 1726867141.48593: variable 'ansible_search_path' from source: unknown 11000 1726867141.48598: calling self._execute() 11000 1726867141.48695: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867141.48698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867141.48701: variable 'omit' from source: magic vars 11000 1726867141.49149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867141.49462: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867141.49518: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867141.49555: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867141.49607: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867141.49701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867141.49731: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867141.49759: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867141.49890: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867141.49937: Evaluated conditional (not __network_is_ostree is defined): True 11000 1726867141.49947: variable 'omit' from source: magic vars 11000 1726867141.49986: variable 'omit' from source: magic vars 11000 1726867141.50124: variable '__ostree_booted_stat' from source: set_fact 11000 1726867141.50173: variable 'omit' from source: magic vars 11000 1726867141.50203: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867141.50249: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867141.50270: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867141.50295: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867141.50328: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867141.50358: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867141.50383: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867141.50386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867141.50482: Set connection var ansible_shell_type to sh 11000 
1726867141.50545: Set connection var ansible_pipelining to False 11000 1726867141.50548: Set connection var ansible_shell_executable to /bin/sh 11000 1726867141.50557: Set connection var ansible_connection to ssh 11000 1726867141.50560: Set connection var ansible_timeout to 10 11000 1726867141.50562: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867141.50572: variable 'ansible_shell_executable' from source: unknown 11000 1726867141.50585: variable 'ansible_connection' from source: unknown 11000 1726867141.50592: variable 'ansible_module_compression' from source: unknown 11000 1726867141.50599: variable 'ansible_shell_type' from source: unknown 11000 1726867141.50605: variable 'ansible_shell_executable' from source: unknown 11000 1726867141.50611: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867141.50617: variable 'ansible_pipelining' from source: unknown 11000 1726867141.50623: variable 'ansible_timeout' from source: unknown 11000 1726867141.50654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867141.50746: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867141.50775: variable 'omit' from source: magic vars 11000 1726867141.50870: starting attempt loop 11000 1726867141.50874: running the handler 11000 1726867141.50879: handler run complete 11000 1726867141.50882: attempt loop complete, returning result 11000 1726867141.50884: _execute() done 11000 1726867141.50886: dumping result to json 11000 1726867141.50888: done dumping result, returning 11000 1726867141.50890: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [0affcac9-a3a5-c734-026a-0000000000e1] 11000 1726867141.50892: sending task result for task 0affcac9-a3a5-c734-026a-0000000000e1 11000 1726867141.50952: done sending task result for task 0affcac9-a3a5-c734-026a-0000000000e1 11000 1726867141.50955: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 11000 1726867141.51024: no more pending results, returning what we have 11000 1726867141.51027: results queue empty 11000 1726867141.51028: checking for any_errors_fatal 11000 1726867141.51033: done checking for any_errors_fatal 11000 1726867141.51034: checking for max_fail_percentage 11000 1726867141.51036: done checking for max_fail_percentage 11000 1726867141.51037: checking to see if all hosts have failed and the running result is not ok 11000 1726867141.51038: done checking to see if all hosts have failed 11000 1726867141.51039: getting the remaining hosts for this loop 11000 1726867141.51040: done getting the remaining hosts for this loop 11000 1726867141.51043: getting the next task for host managed_node1 11000 1726867141.51051: done getting next task for host managed_node1 11000 1726867141.51054: ^ task is: TASK: Fix CentOS6 Base repo 11000 1726867141.51056: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867141.51060: getting variables 11000 1726867141.51061: in VariableManager get_vars() 11000 1726867141.51300: Calling all_inventory to load vars for managed_node1 11000 1726867141.51303: Calling groups_inventory to load vars for managed_node1 11000 1726867141.51306: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867141.51315: Calling all_plugins_play to load vars for managed_node1 11000 1726867141.51317: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867141.51325: Calling groups_plugins_play to load vars for managed_node1 11000 1726867141.51509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867141.51635: done with get_vars() 11000 1726867141.51641: done getting variables 11000 1726867141.51725: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Friday 20 September 2024 17:19:01 -0400 (0:00:00.039) 0:00:03.160 ****** 11000 1726867141.51744: entering _queue_task() for managed_node1/copy 11000 1726867141.51930: worker is 1 (out of 1 available) 11000 1726867141.51942: exiting _queue_task() for managed_node1/copy 11000 1726867141.51955: done queuing things up, now waiting for results queue to drain 11000 1726867141.51956: waiting for pending results... 
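What follows is the copy task at el_repo_setup.yml:26. As the trace below shows, it is skipped because only the first of its two conditions holds (ansible_distribution == 'CentOS' is True, ansible_distribution_major_version == '6' is False). A hypothetical outline of such a task is given here; only the action (copy) and the when-conditions are taken from the trace, while the destination and file contents are placeholders:

  - name: Fix CentOS6 Base repo
    ansible.builtin.copy:
      dest: /etc/yum.repos.d/CentOS-Base.repo  # assumed destination
      content: |
        # assumed contents: repo definitions pointing at the CentOS vault
    when:
      - ansible_distribution == 'CentOS'
      - ansible_distribution_major_version == '6'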
11000 1726867141.52096: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo 11000 1726867141.52151: in run() - task 0affcac9-a3a5-c734-026a-0000000000e3 11000 1726867141.52161: variable 'ansible_search_path' from source: unknown 11000 1726867141.52165: variable 'ansible_search_path' from source: unknown 11000 1726867141.52193: calling self._execute() 11000 1726867141.52242: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867141.52245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867141.52252: variable 'omit' from source: magic vars 11000 1726867141.52573: variable 'ansible_distribution' from source: facts 11000 1726867141.52593: Evaluated conditional (ansible_distribution == 'CentOS'): True 11000 1726867141.52672: variable 'ansible_distribution_major_version' from source: facts 11000 1726867141.52676: Evaluated conditional (ansible_distribution_major_version == '6'): False 11000 1726867141.52680: when evaluation is False, skipping this task 11000 1726867141.52683: _execute() done 11000 1726867141.52689: dumping result to json 11000 1726867141.52692: done dumping result, returning 11000 1726867141.52695: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [0affcac9-a3a5-c734-026a-0000000000e3] 11000 1726867141.52704: sending task result for task 0affcac9-a3a5-c734-026a-0000000000e3 11000 1726867141.52796: done sending task result for task 0affcac9-a3a5-c734-026a-0000000000e3 11000 1726867141.52799: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 11000 1726867141.52883: no more pending results, returning what we have 11000 1726867141.52888: results queue empty 11000 1726867141.52889: checking for any_errors_fatal 11000 1726867141.52892: done checking for any_errors_fatal 11000 1726867141.52893: checking for max_fail_percentage 11000 1726867141.52894: done checking for max_fail_percentage 11000 1726867141.52895: checking to see if all hosts have failed and the running result is not ok 11000 1726867141.52896: done checking to see if all hosts have failed 11000 1726867141.52896: getting the remaining hosts for this loop 11000 1726867141.52898: done getting the remaining hosts for this loop 11000 1726867141.52901: getting the next task for host managed_node1 11000 1726867141.52905: done getting next task for host managed_node1 11000 1726867141.52908: ^ task is: TASK: Include the task 'enable_epel.yml' 11000 1726867141.52911: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867141.52915: getting variables 11000 1726867141.52916: in VariableManager get_vars() 11000 1726867141.52937: Calling all_inventory to load vars for managed_node1 11000 1726867141.52939: Calling groups_inventory to load vars for managed_node1 11000 1726867141.52941: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867141.52947: Calling all_plugins_play to load vars for managed_node1 11000 1726867141.52948: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867141.52950: Calling groups_plugins_play to load vars for managed_node1 11000 1726867141.53054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867141.53237: done with get_vars() 11000 1726867141.53245: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Friday 20 September 2024 17:19:01 -0400 (0:00:00.015) 0:00:03.176 ****** 11000 1726867141.53334: entering _queue_task() for managed_node1/include_tasks 11000 1726867141.53604: worker is 1 (out of 1 available) 11000 1726867141.53616: exiting _queue_task() for managed_node1/include_tasks 11000 1726867141.53631: done queuing things up, now waiting for results queue to drain 11000 1726867141.53633: waiting for pending results... 11000 1726867141.53900: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' 11000 1726867141.53905: in run() - task 0affcac9-a3a5-c734-026a-0000000000e4 11000 1726867141.53910: variable 'ansible_search_path' from source: unknown 11000 1726867141.53913: variable 'ansible_search_path' from source: unknown 11000 1726867141.53941: calling self._execute() 11000 1726867141.54025: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867141.54044: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867141.54057: variable 'omit' from source: magic vars 11000 1726867141.54620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867141.56326: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867141.56375: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867141.56403: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867141.56431: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867141.56452: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867141.56509: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867141.56532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867141.56551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 11000 1726867141.56579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867141.56643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867141.56672: variable '__network_is_ostree' from source: set_fact 11000 1726867141.56689: Evaluated conditional (not __network_is_ostree | d(false)): True 11000 1726867141.56693: _execute() done 11000 1726867141.56695: dumping result to json 11000 1726867141.56697: done dumping result, returning 11000 1726867141.56703: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [0affcac9-a3a5-c734-026a-0000000000e4] 11000 1726867141.56706: sending task result for task 0affcac9-a3a5-c734-026a-0000000000e4 11000 1726867141.56790: done sending task result for task 0affcac9-a3a5-c734-026a-0000000000e4 11000 1726867141.56793: WORKER PROCESS EXITING 11000 1726867141.56821: no more pending results, returning what we have 11000 1726867141.56826: in VariableManager get_vars() 11000 1726867141.56857: Calling all_inventory to load vars for managed_node1 11000 1726867141.56860: Calling groups_inventory to load vars for managed_node1 11000 1726867141.56863: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867141.56872: Calling all_plugins_play to load vars for managed_node1 11000 1726867141.56875: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867141.56879: Calling groups_plugins_play to load vars for managed_node1 11000 1726867141.57045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867141.57152: done with get_vars() 11000 1726867141.57158: variable 'ansible_search_path' from source: unknown 11000 1726867141.57159: variable 'ansible_search_path' from source: unknown 11000 1726867141.57184: we have included files to process 11000 1726867141.57185: generating all_blocks data 11000 1726867141.57187: done generating all_blocks data 11000 1726867141.57192: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11000 1726867141.57193: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11000 1726867141.57195: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 11000 1726867141.57847: done processing included file 11000 1726867141.57850: iterating over new_blocks loaded from include file 11000 1726867141.57851: in VariableManager get_vars() 11000 1726867141.57861: done with get_vars() 11000 1726867141.57863: filtering new block on tags 11000 1726867141.57889: done filtering new block on tags 11000 1726867141.57892: in VariableManager get_vars() 11000 1726867141.57902: done with get_vars() 11000 1726867141.57904: filtering new block on tags 11000 1726867141.57924: done filtering new block on tags 11000 1726867141.57926: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1 11000 1726867141.57931: extending task lists for all hosts with included blocks 
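The include above is a dynamic include_tasks gated on the ostree check: the conditional (not __network_is_ostree | d(false)) evaluates to True, so enable_epel.yml is loaded and its blocks are appended to the host's task list. A plausible reconstruction of the include task at el_repo_setup.yml:51, assuming a relative path and these exact option names (the file itself is not reproduced in this log):

    - name: Include the task 'enable_epel.yml'
      ansible.builtin.include_tasks: enable_epel.yml
      when: not __network_is_ostree | d(false)

Because the include is dynamic, the newly loaded blocks are filtered on tags and spliced into the task list at run time, which is what the "filtering new block on tags" and "extending task lists for all hosts with included blocks" entries above record.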
11000 1726867141.58057: done extending task lists 11000 1726867141.58084: done processing included files 11000 1726867141.58088: results queue empty 11000 1726867141.58089: checking for any_errors_fatal 11000 1726867141.58092: done checking for any_errors_fatal 11000 1726867141.58093: checking for max_fail_percentage 11000 1726867141.58094: done checking for max_fail_percentage 11000 1726867141.58095: checking to see if all hosts have failed and the running result is not ok 11000 1726867141.58096: done checking to see if all hosts have failed 11000 1726867141.58096: getting the remaining hosts for this loop 11000 1726867141.58097: done getting the remaining hosts for this loop 11000 1726867141.58100: getting the next task for host managed_node1 11000 1726867141.58104: done getting next task for host managed_node1 11000 1726867141.58106: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 11000 1726867141.58108: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867141.58110: getting variables 11000 1726867141.58111: in VariableManager get_vars() 11000 1726867141.58118: Calling all_inventory to load vars for managed_node1 11000 1726867141.58120: Calling groups_inventory to load vars for managed_node1 11000 1726867141.58123: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867141.58127: Calling all_plugins_play to load vars for managed_node1 11000 1726867141.58158: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867141.58162: Calling groups_plugins_play to load vars for managed_node1 11000 1726867141.58313: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867141.58428: done with get_vars() 11000 1726867141.58435: done getting variables 11000 1726867141.58489: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 11000 1726867141.58626: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Friday 20 September 2024 17:19:01 -0400 (0:00:00.053) 0:00:03.229 ****** 11000 1726867141.58657: entering _queue_task() for managed_node1/command 11000 1726867141.58658: Creating lock for command 11000 1726867141.58857: worker is 1 (out of 1 available) 11000 1726867141.58868: exiting _queue_task() for managed_node1/command 11000 1726867141.58881: done queuing things up, now waiting for results queue to drain 11000 1726867141.58882: waiting for pending results... 
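The raw task name is "Create EPEL {{ ansible_distribution_major_version }}"; the header above shows it rendered to "Create EPEL 10" from the host's facts, so the managed node reports distribution major version 10. A sketch of how such a templated, version-guarded command task might be written, assuming the guards that are evaluated just below; the actual command body is not visible in this log, so /bin/true is only a placeholder:

    - name: Create EPEL {{ ansible_distribution_major_version }}
      ansible.builtin.command: /bin/true   # placeholder; the real command is not shown in this log
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']

With major version 10, the second guard is False and the task is skipped, as the result below reports.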
11000 1726867141.59019: running TaskExecutor() for managed_node1/TASK: Create EPEL 10 11000 1726867141.59083: in run() - task 0affcac9-a3a5-c734-026a-0000000000fe 11000 1726867141.59093: variable 'ansible_search_path' from source: unknown 11000 1726867141.59096: variable 'ansible_search_path' from source: unknown 11000 1726867141.59123: calling self._execute() 11000 1726867141.59176: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867141.59181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867141.59219: variable 'omit' from source: magic vars 11000 1726867141.59442: variable 'ansible_distribution' from source: facts 11000 1726867141.59451: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11000 1726867141.59538: variable 'ansible_distribution_major_version' from source: facts 11000 1726867141.59542: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11000 1726867141.59545: when evaluation is False, skipping this task 11000 1726867141.59548: _execute() done 11000 1726867141.59551: dumping result to json 11000 1726867141.59553: done dumping result, returning 11000 1726867141.59559: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [0affcac9-a3a5-c734-026a-0000000000fe] 11000 1726867141.59562: sending task result for task 0affcac9-a3a5-c734-026a-0000000000fe 11000 1726867141.59651: done sending task result for task 0affcac9-a3a5-c734-026a-0000000000fe 11000 1726867141.59654: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11000 1726867141.59712: no more pending results, returning what we have 11000 1726867141.59715: results queue empty 11000 1726867141.59716: checking for any_errors_fatal 11000 1726867141.59717: done checking for any_errors_fatal 11000 1726867141.59717: checking for max_fail_percentage 11000 1726867141.59719: done checking for max_fail_percentage 11000 1726867141.59719: checking to see if all hosts have failed and the running result is not ok 11000 1726867141.59720: done checking to see if all hosts have failed 11000 1726867141.59721: getting the remaining hosts for this loop 11000 1726867141.59722: done getting the remaining hosts for this loop 11000 1726867141.59724: getting the next task for host managed_node1 11000 1726867141.59728: done getting next task for host managed_node1 11000 1726867141.59730: ^ task is: TASK: Install yum-utils package 11000 1726867141.59733: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867141.59736: getting variables 11000 1726867141.59737: in VariableManager get_vars() 11000 1726867141.59757: Calling all_inventory to load vars for managed_node1 11000 1726867141.59760: Calling groups_inventory to load vars for managed_node1 11000 1726867141.59762: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867141.59770: Calling all_plugins_play to load vars for managed_node1 11000 1726867141.59772: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867141.59775: Calling groups_plugins_play to load vars for managed_node1 11000 1726867141.59899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867141.60010: done with get_vars() 11000 1726867141.60016: done getting variables 11000 1726867141.60076: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Friday 20 September 2024 17:19:01 -0400 (0:00:00.014) 0:00:03.244 ****** 11000 1726867141.60100: entering _queue_task() for managed_node1/package 11000 1726867141.60101: Creating lock for package 11000 1726867141.60274: worker is 1 (out of 1 available) 11000 1726867141.60289: exiting _queue_task() for managed_node1/package 11000 1726867141.60300: done queuing things up, now waiting for results queue to drain 11000 1726867141.60301: waiting for pending results... 
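This task uses the generic package action plugin (loaded above), which dispatches to the distribution's package manager. A minimal sketch consistent with the task title and the conditionals reported in the skip result below; state: present is an assumption, not something the log states:

    - name: Install yum-utils package
      ansible.builtin.package:
        name: yum-utils
        state: present      # assumed; only the package name is implied by the task title
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']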
11000 1726867141.60424: running TaskExecutor() for managed_node1/TASK: Install yum-utils package 11000 1726867141.60490: in run() - task 0affcac9-a3a5-c734-026a-0000000000ff 11000 1726867141.60497: variable 'ansible_search_path' from source: unknown 11000 1726867141.60500: variable 'ansible_search_path' from source: unknown 11000 1726867141.60525: calling self._execute() 11000 1726867141.60575: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867141.60580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867141.60590: variable 'omit' from source: magic vars 11000 1726867141.60831: variable 'ansible_distribution' from source: facts 11000 1726867141.60840: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11000 1726867141.60926: variable 'ansible_distribution_major_version' from source: facts 11000 1726867141.60930: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11000 1726867141.60933: when evaluation is False, skipping this task 11000 1726867141.60936: _execute() done 11000 1726867141.60938: dumping result to json 11000 1726867141.60941: done dumping result, returning 11000 1726867141.60947: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [0affcac9-a3a5-c734-026a-0000000000ff] 11000 1726867141.60951: sending task result for task 0affcac9-a3a5-c734-026a-0000000000ff 11000 1726867141.61034: done sending task result for task 0affcac9-a3a5-c734-026a-0000000000ff 11000 1726867141.61037: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11000 1726867141.61075: no more pending results, returning what we have 11000 1726867141.61080: results queue empty 11000 1726867141.61081: checking for any_errors_fatal 11000 1726867141.61088: done checking for any_errors_fatal 11000 1726867141.61089: checking for max_fail_percentage 11000 1726867141.61090: done checking for max_fail_percentage 11000 1726867141.61091: checking to see if all hosts have failed and the running result is not ok 11000 1726867141.61092: done checking to see if all hosts have failed 11000 1726867141.61092: getting the remaining hosts for this loop 11000 1726867141.61093: done getting the remaining hosts for this loop 11000 1726867141.61096: getting the next task for host managed_node1 11000 1726867141.61100: done getting next task for host managed_node1 11000 1726867141.61103: ^ task is: TASK: Enable EPEL 7 11000 1726867141.61106: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867141.61108: getting variables 11000 1726867141.61109: in VariableManager get_vars() 11000 1726867141.61131: Calling all_inventory to load vars for managed_node1 11000 1726867141.61133: Calling groups_inventory to load vars for managed_node1 11000 1726867141.61135: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867141.61142: Calling all_plugins_play to load vars for managed_node1 11000 1726867141.61145: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867141.61148: Calling groups_plugins_play to load vars for managed_node1 11000 1726867141.61247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867141.61359: done with get_vars() 11000 1726867141.61367: done getting variables 11000 1726867141.61407: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Friday 20 September 2024 17:19:01 -0400 (0:00:00.013) 0:00:03.257 ****** 11000 1726867141.61424: entering _queue_task() for managed_node1/command 11000 1726867141.61591: worker is 1 (out of 1 available) 11000 1726867141.61602: exiting _queue_task() for managed_node1/command 11000 1726867141.61615: done queuing things up, now waiting for results queue to drain 11000 1726867141.61616: waiting for pending results... 11000 1726867141.61738: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 11000 1726867141.61804: in run() - task 0affcac9-a3a5-c734-026a-000000000100 11000 1726867141.61812: variable 'ansible_search_path' from source: unknown 11000 1726867141.61816: variable 'ansible_search_path' from source: unknown 11000 1726867141.61840: calling self._execute() 11000 1726867141.61974: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867141.61980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867141.61983: variable 'omit' from source: magic vars 11000 1726867141.62175: variable 'ansible_distribution' from source: facts 11000 1726867141.62195: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11000 1726867141.62268: variable 'ansible_distribution_major_version' from source: facts 11000 1726867141.62271: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11000 1726867141.62274: when evaluation is False, skipping this task 11000 1726867141.62279: _execute() done 11000 1726867141.62281: dumping result to json 11000 1726867141.62289: done dumping result, returning 11000 1726867141.62292: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [0affcac9-a3a5-c734-026a-000000000100] 11000 1726867141.62296: sending task result for task 0affcac9-a3a5-c734-026a-000000000100 11000 1726867141.62374: done sending task result for task 0affcac9-a3a5-c734-026a-000000000100 11000 1726867141.62376: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11000 1726867141.62441: no more pending results, returning what we 
have 11000 1726867141.62444: results queue empty 11000 1726867141.62445: checking for any_errors_fatal 11000 1726867141.62449: done checking for any_errors_fatal 11000 1726867141.62449: checking for max_fail_percentage 11000 1726867141.62451: done checking for max_fail_percentage 11000 1726867141.62452: checking to see if all hosts have failed and the running result is not ok 11000 1726867141.62452: done checking to see if all hosts have failed 11000 1726867141.62453: getting the remaining hosts for this loop 11000 1726867141.62454: done getting the remaining hosts for this loop 11000 1726867141.62457: getting the next task for host managed_node1 11000 1726867141.62462: done getting next task for host managed_node1 11000 1726867141.62463: ^ task is: TASK: Enable EPEL 8 11000 1726867141.62466: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867141.62469: getting variables 11000 1726867141.62470: in VariableManager get_vars() 11000 1726867141.62498: Calling all_inventory to load vars for managed_node1 11000 1726867141.62500: Calling groups_inventory to load vars for managed_node1 11000 1726867141.62502: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867141.62508: Calling all_plugins_play to load vars for managed_node1 11000 1726867141.62509: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867141.62511: Calling groups_plugins_play to load vars for managed_node1 11000 1726867141.62632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867141.62743: done with get_vars() 11000 1726867141.62749: done getting variables 11000 1726867141.62786: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 17:19:01 -0400 (0:00:00.013) 0:00:03.271 ****** 11000 1726867141.62806: entering _queue_task() for managed_node1/command 11000 1726867141.62962: worker is 1 (out of 1 available) 11000 1726867141.62973: exiting _queue_task() for managed_node1/command 11000 1726867141.62985: done queuing things up, now waiting for results queue to drain 11000 1726867141.62986: waiting for pending results... 
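"Enable EPEL 7" was just skipped and "Enable EPEL 8" is about to be skipped for the same reason: both are command tasks gated on ansible_distribution_major_version in ['7', '8'], and this node reports version 10. The shape of one such guarded task, with the actual command left as a placeholder since it is not captured in this log:

    - name: Enable EPEL 8
      ansible.builtin.command: /bin/true   # placeholder; the real enable command is not in this log
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version in ['7', '8']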
11000 1726867141.63129: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 11000 1726867141.63200: in run() - task 0affcac9-a3a5-c734-026a-000000000101 11000 1726867141.63211: variable 'ansible_search_path' from source: unknown 11000 1726867141.63215: variable 'ansible_search_path' from source: unknown 11000 1726867141.63241: calling self._execute() 11000 1726867141.63293: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867141.63298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867141.63306: variable 'omit' from source: magic vars 11000 1726867141.63566: variable 'ansible_distribution' from source: facts 11000 1726867141.63581: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11000 1726867141.63664: variable 'ansible_distribution_major_version' from source: facts 11000 1726867141.63668: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11000 1726867141.63671: when evaluation is False, skipping this task 11000 1726867141.63674: _execute() done 11000 1726867141.63676: dumping result to json 11000 1726867141.63682: done dumping result, returning 11000 1726867141.63696: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [0affcac9-a3a5-c734-026a-000000000101] 11000 1726867141.63699: sending task result for task 0affcac9-a3a5-c734-026a-000000000101 11000 1726867141.63775: done sending task result for task 0affcac9-a3a5-c734-026a-000000000101 11000 1726867141.63781: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11000 1726867141.63824: no more pending results, returning what we have 11000 1726867141.63826: results queue empty 11000 1726867141.63827: checking for any_errors_fatal 11000 1726867141.63830: done checking for any_errors_fatal 11000 1726867141.63830: checking for max_fail_percentage 11000 1726867141.63832: done checking for max_fail_percentage 11000 1726867141.63833: checking to see if all hosts have failed and the running result is not ok 11000 1726867141.63833: done checking to see if all hosts have failed 11000 1726867141.63834: getting the remaining hosts for this loop 11000 1726867141.63835: done getting the remaining hosts for this loop 11000 1726867141.63838: getting the next task for host managed_node1 11000 1726867141.63844: done getting next task for host managed_node1 11000 1726867141.63846: ^ task is: TASK: Enable EPEL 6 11000 1726867141.63849: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867141.63852: getting variables 11000 1726867141.63853: in VariableManager get_vars() 11000 1726867141.63874: Calling all_inventory to load vars for managed_node1 11000 1726867141.63880: Calling groups_inventory to load vars for managed_node1 11000 1726867141.63883: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867141.63890: Calling all_plugins_play to load vars for managed_node1 11000 1726867141.63893: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867141.63895: Calling groups_plugins_play to load vars for managed_node1 11000 1726867141.63997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867141.64108: done with get_vars() 11000 1726867141.64114: done getting variables 11000 1726867141.64151: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 17:19:01 -0400 (0:00:00.013) 0:00:03.284 ****** 11000 1726867141.64169: entering _queue_task() for managed_node1/copy 11000 1726867141.64324: worker is 1 (out of 1 available) 11000 1726867141.64334: exiting _queue_task() for managed_node1/copy 11000 1726867141.64344: done queuing things up, now waiting for results queue to drain 11000 1726867141.64346: waiting for pending results... 11000 1726867141.64481: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 11000 1726867141.64535: in run() - task 0affcac9-a3a5-c734-026a-000000000103 11000 1726867141.64545: variable 'ansible_search_path' from source: unknown 11000 1726867141.64548: variable 'ansible_search_path' from source: unknown 11000 1726867141.64580: calling self._execute() 11000 1726867141.64623: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867141.64627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867141.64636: variable 'omit' from source: magic vars 11000 1726867141.64920: variable 'ansible_distribution' from source: facts 11000 1726867141.64930: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11000 1726867141.65005: variable 'ansible_distribution_major_version' from source: facts 11000 1726867141.65009: Evaluated conditional (ansible_distribution_major_version == '6'): False 11000 1726867141.65012: when evaluation is False, skipping this task 11000 1726867141.65014: _execute() done 11000 1726867141.65017: dumping result to json 11000 1726867141.65019: done dumping result, returning 11000 1726867141.65031: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [0affcac9-a3a5-c734-026a-000000000103] 11000 1726867141.65034: sending task result for task 0affcac9-a3a5-c734-026a-000000000103 11000 1726867141.65109: done sending task result for task 0affcac9-a3a5-c734-026a-000000000103 11000 1726867141.65112: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 11000 1726867141.65164: no more pending results, returning what we have 11000 
1726867141.65166: results queue empty 11000 1726867141.65167: checking for any_errors_fatal 11000 1726867141.65170: done checking for any_errors_fatal 11000 1726867141.65171: checking for max_fail_percentage 11000 1726867141.65173: done checking for max_fail_percentage 11000 1726867141.65173: checking to see if all hosts have failed and the running result is not ok 11000 1726867141.65174: done checking to see if all hosts have failed 11000 1726867141.65175: getting the remaining hosts for this loop 11000 1726867141.65176: done getting the remaining hosts for this loop 11000 1726867141.65188: getting the next task for host managed_node1 11000 1726867141.65195: done getting next task for host managed_node1 11000 1726867141.65197: ^ task is: TASK: Set network provider to 'nm' 11000 1726867141.65199: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867141.65202: getting variables 11000 1726867141.65203: in VariableManager get_vars() 11000 1726867141.65221: Calling all_inventory to load vars for managed_node1 11000 1726867141.65223: Calling groups_inventory to load vars for managed_node1 11000 1726867141.65224: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867141.65231: Calling all_plugins_play to load vars for managed_node1 11000 1726867141.65233: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867141.65234: Calling groups_plugins_play to load vars for managed_node1 11000 1726867141.65360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867141.65470: done with get_vars() 11000 1726867141.65476: done getting variables 11000 1726867141.65519: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:13 Friday 20 September 2024 17:19:01 -0400 (0:00:00.013) 0:00:03.298 ****** 11000 1726867141.65536: entering _queue_task() for managed_node1/set_fact 11000 1726867141.65689: worker is 1 (out of 1 available) 11000 1726867141.65701: exiting _queue_task() for managed_node1/set_fact 11000 1726867141.65710: done queuing things up, now waiting for results queue to drain 11000 1726867141.65711: waiting for pending results... 
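Unlike the command and package tasks above, set_fact is an action plugin that runs entirely on the controller, which is why the entries that follow show the handler completing with no SSH round trip. Based on the ansible_facts reported in the ok result below, the task is equivalent to:

    - name: Set network provider to 'nm'
      ansible.builtin.set_fact:
        network_provider: nm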
11000 1726867141.65840: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 11000 1726867141.65895: in run() - task 0affcac9-a3a5-c734-026a-000000000007 11000 1726867141.65905: variable 'ansible_search_path' from source: unknown 11000 1726867141.65933: calling self._execute() 11000 1726867141.65988: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867141.65992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867141.65998: variable 'omit' from source: magic vars 11000 1726867141.66070: variable 'omit' from source: magic vars 11000 1726867141.66094: variable 'omit' from source: magic vars 11000 1726867141.66117: variable 'omit' from source: magic vars 11000 1726867141.66147: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867141.66175: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867141.66200: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867141.66214: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867141.66224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867141.66246: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867141.66249: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867141.66251: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867141.66326: Set connection var ansible_shell_type to sh 11000 1726867141.66333: Set connection var ansible_pipelining to False 11000 1726867141.66340: Set connection var ansible_shell_executable to /bin/sh 11000 1726867141.66343: Set connection var ansible_connection to ssh 11000 1726867141.66348: Set connection var ansible_timeout to 10 11000 1726867141.66353: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867141.66372: variable 'ansible_shell_executable' from source: unknown 11000 1726867141.66374: variable 'ansible_connection' from source: unknown 11000 1726867141.66378: variable 'ansible_module_compression' from source: unknown 11000 1726867141.66381: variable 'ansible_shell_type' from source: unknown 11000 1726867141.66383: variable 'ansible_shell_executable' from source: unknown 11000 1726867141.66388: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867141.66390: variable 'ansible_pipelining' from source: unknown 11000 1726867141.66392: variable 'ansible_timeout' from source: unknown 11000 1726867141.66404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867141.66495: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867141.66514: variable 'omit' from source: magic vars 11000 1726867141.66517: starting attempt loop 11000 1726867141.66520: running the handler 11000 1726867141.66523: handler run complete 11000 1726867141.66530: attempt loop complete, returning result 11000 1726867141.66533: _execute() done 11000 1726867141.66535: 
dumping result to json 11000 1726867141.66537: done dumping result, returning 11000 1726867141.66544: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [0affcac9-a3a5-c734-026a-000000000007] 11000 1726867141.66548: sending task result for task 0affcac9-a3a5-c734-026a-000000000007 11000 1726867141.66626: done sending task result for task 0affcac9-a3a5-c734-026a-000000000007 11000 1726867141.66629: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 11000 1726867141.66680: no more pending results, returning what we have 11000 1726867141.66682: results queue empty 11000 1726867141.66683: checking for any_errors_fatal 11000 1726867141.66688: done checking for any_errors_fatal 11000 1726867141.66689: checking for max_fail_percentage 11000 1726867141.66690: done checking for max_fail_percentage 11000 1726867141.66691: checking to see if all hosts have failed and the running result is not ok 11000 1726867141.66691: done checking to see if all hosts have failed 11000 1726867141.66692: getting the remaining hosts for this loop 11000 1726867141.66693: done getting the remaining hosts for this loop 11000 1726867141.66696: getting the next task for host managed_node1 11000 1726867141.66700: done getting next task for host managed_node1 11000 1726867141.66702: ^ task is: TASK: meta (flush_handlers) 11000 1726867141.66703: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867141.66706: getting variables 11000 1726867141.66707: in VariableManager get_vars() 11000 1726867141.66728: Calling all_inventory to load vars for managed_node1 11000 1726867141.66730: Calling groups_inventory to load vars for managed_node1 11000 1726867141.66733: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867141.66748: Calling all_plugins_play to load vars for managed_node1 11000 1726867141.66750: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867141.66752: Calling groups_plugins_play to load vars for managed_node1 11000 1726867141.66856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867141.67088: done with get_vars() 11000 1726867141.67094: done getting variables 11000 1726867141.67133: in VariableManager get_vars() 11000 1726867141.67139: Calling all_inventory to load vars for managed_node1 11000 1726867141.67140: Calling groups_inventory to load vars for managed_node1 11000 1726867141.67141: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867141.67144: Calling all_plugins_play to load vars for managed_node1 11000 1726867141.67145: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867141.67147: Calling groups_plugins_play to load vars for managed_node1 11000 1726867141.67226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867141.67328: done with get_vars() 11000 1726867141.67336: done queuing things up, now waiting for results queue to drain 11000 1726867141.67337: results queue empty 11000 1726867141.67337: checking for any_errors_fatal 11000 1726867141.67339: done checking for any_errors_fatal 11000 1726867141.67339: checking for 
max_fail_percentage 11000 1726867141.67340: done checking for max_fail_percentage 11000 1726867141.67340: checking to see if all hosts have failed and the running result is not ok 11000 1726867141.67340: done checking to see if all hosts have failed 11000 1726867141.67341: getting the remaining hosts for this loop 11000 1726867141.67341: done getting the remaining hosts for this loop 11000 1726867141.67343: getting the next task for host managed_node1 11000 1726867141.67345: done getting next task for host managed_node1 11000 1726867141.67346: ^ task is: TASK: meta (flush_handlers) 11000 1726867141.67347: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867141.67352: getting variables 11000 1726867141.67352: in VariableManager get_vars() 11000 1726867141.67357: Calling all_inventory to load vars for managed_node1 11000 1726867141.67358: Calling groups_inventory to load vars for managed_node1 11000 1726867141.67360: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867141.67362: Calling all_plugins_play to load vars for managed_node1 11000 1726867141.67364: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867141.67365: Calling groups_plugins_play to load vars for managed_node1 11000 1726867141.67447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867141.67562: done with get_vars() 11000 1726867141.67567: done getting variables 11000 1726867141.67598: in VariableManager get_vars() 11000 1726867141.67604: Calling all_inventory to load vars for managed_node1 11000 1726867141.67605: Calling groups_inventory to load vars for managed_node1 11000 1726867141.67606: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867141.67609: Calling all_plugins_play to load vars for managed_node1 11000 1726867141.67611: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867141.67613: Calling groups_plugins_play to load vars for managed_node1 11000 1726867141.67691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867141.67799: done with get_vars() 11000 1726867141.67806: done queuing things up, now waiting for results queue to drain 11000 1726867141.67807: results queue empty 11000 1726867141.67808: checking for any_errors_fatal 11000 1726867141.67808: done checking for any_errors_fatal 11000 1726867141.67809: checking for max_fail_percentage 11000 1726867141.67809: done checking for max_fail_percentage 11000 1726867141.67810: checking to see if all hosts have failed and the running result is not ok 11000 1726867141.67810: done checking to see if all hosts have failed 11000 1726867141.67811: getting the remaining hosts for this loop 11000 1726867141.67811: done getting the remaining hosts for this loop 11000 1726867141.67813: getting the next task for host managed_node1 11000 1726867141.67814: done getting next task for host managed_node1 11000 1726867141.67815: ^ task is: None 11000 1726867141.67815: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11000 1726867141.67816: done queuing things up, now waiting for results queue to drain 11000 1726867141.67817: results queue empty 11000 1726867141.67817: checking for any_errors_fatal 11000 1726867141.67817: done checking for any_errors_fatal 11000 1726867141.67818: checking for max_fail_percentage 11000 1726867141.67818: done checking for max_fail_percentage 11000 1726867141.67819: checking to see if all hosts have failed and the running result is not ok 11000 1726867141.67819: done checking to see if all hosts have failed 11000 1726867141.67820: getting the next task for host managed_node1 11000 1726867141.67822: done getting next task for host managed_node1 11000 1726867141.67822: ^ task is: None 11000 1726867141.67823: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867141.67855: in VariableManager get_vars() 11000 1726867141.67870: done with get_vars() 11000 1726867141.67874: in VariableManager get_vars() 11000 1726867141.67885: done with get_vars() 11000 1726867141.67888: variable 'omit' from source: magic vars 11000 1726867141.67909: in VariableManager get_vars() 11000 1726867141.67918: done with get_vars() 11000 1726867141.67931: variable 'omit' from source: magic vars PLAY [Play for testing bond device using deprecated 'master' argument] ********* 11000 1726867141.68317: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 11000 1726867141.68336: getting the remaining hosts for this loop 11000 1726867141.68337: done getting the remaining hosts for this loop 11000 1726867141.68339: getting the next task for host managed_node1 11000 1726867141.68341: done getting next task for host managed_node1 11000 1726867141.68342: ^ task is: TASK: Gathering Facts 11000 1726867141.68343: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867141.68344: getting variables 11000 1726867141.68344: in VariableManager get_vars() 11000 1726867141.68353: Calling all_inventory to load vars for managed_node1 11000 1726867141.68354: Calling groups_inventory to load vars for managed_node1 11000 1726867141.68355: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867141.68358: Calling all_plugins_play to load vars for managed_node1 11000 1726867141.68366: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867141.68368: Calling groups_plugins_play to load vars for managed_node1 11000 1726867141.68462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867141.68567: done with get_vars() 11000 1726867141.68572: done getting variables 11000 1726867141.68602: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:3 Friday 20 September 2024 17:19:01 -0400 (0:00:00.030) 0:00:03.329 ****** 11000 1726867141.68617: entering _queue_task() for managed_node1/gather_facts 11000 1726867141.68780: worker is 1 (out of 1 available) 11000 1726867141.68792: exiting _queue_task() for managed_node1/gather_facts 11000 1726867141.68801: done queuing things up, now waiting for results queue to drain 11000 1726867141.68802: waiting for pending results... 
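From here the log records the standard remote-module lifecycle for fact gathering: a home-directory probe (/bin/sh -c 'echo ~ && sleep 0'), creation of a per-task temp directory under ~/.ansible/tmp, SFTP transfer of the packaged AnsiballZ_setup.py, a chmod u+x, and, beyond this excerpt, the module run itself. The implicit "Gathering Facts" task behaves like an explicit setup call; a minimal sketch, assuming the default gather subset (not read from this log):

    - name: Gathering Facts (explicit form of the implicit fact gather)
      ansible.builtin.setup:
        gather_subset:
          - all             # module default; assumed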
11000 1726867141.68934: running TaskExecutor() for managed_node1/TASK: Gathering Facts 11000 1726867141.68993: in run() - task 0affcac9-a3a5-c734-026a-000000000129 11000 1726867141.69003: variable 'ansible_search_path' from source: unknown 11000 1726867141.69029: calling self._execute() 11000 1726867141.69086: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867141.69094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867141.69102: variable 'omit' from source: magic vars 11000 1726867141.69354: variable 'ansible_distribution_major_version' from source: facts 11000 1726867141.69362: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867141.69367: variable 'omit' from source: magic vars 11000 1726867141.69388: variable 'omit' from source: magic vars 11000 1726867141.69412: variable 'omit' from source: magic vars 11000 1726867141.69440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867141.69466: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867141.69485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867141.69501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867141.69510: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867141.69532: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867141.69535: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867141.69537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867141.69605: Set connection var ansible_shell_type to sh 11000 1726867141.69613: Set connection var ansible_pipelining to False 11000 1726867141.69620: Set connection var ansible_shell_executable to /bin/sh 11000 1726867141.69622: Set connection var ansible_connection to ssh 11000 1726867141.69627: Set connection var ansible_timeout to 10 11000 1726867141.69632: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867141.69651: variable 'ansible_shell_executable' from source: unknown 11000 1726867141.69654: variable 'ansible_connection' from source: unknown 11000 1726867141.69657: variable 'ansible_module_compression' from source: unknown 11000 1726867141.69660: variable 'ansible_shell_type' from source: unknown 11000 1726867141.69662: variable 'ansible_shell_executable' from source: unknown 11000 1726867141.69664: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867141.69666: variable 'ansible_pipelining' from source: unknown 11000 1726867141.69669: variable 'ansible_timeout' from source: unknown 11000 1726867141.69674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867141.69803: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867141.69820: variable 'omit' from source: magic vars 11000 1726867141.69823: starting attempt loop 11000 1726867141.69826: running the 
handler 11000 1726867141.69831: variable 'ansible_facts' from source: unknown 11000 1726867141.69847: _low_level_execute_command(): starting 11000 1726867141.69853: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867141.70367: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867141.70371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867141.70374: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867141.70432: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867141.70437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867141.70439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867141.70504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11000 1726867141.72898: stdout chunk (state=3): >>>/root <<< 11000 1726867141.73043: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867141.73081: stderr chunk (state=3): >>><<< 11000 1726867141.73084: stdout chunk (state=3): >>><<< 11000 1726867141.73100: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11000 1726867141.73111: _low_level_execute_command(): starting 11000 1726867141.73118: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867141.7310061-11162-222303903632576 `" && 
echo ansible-tmp-1726867141.7310061-11162-222303903632576="` echo /root/.ansible/tmp/ansible-tmp-1726867141.7310061-11162-222303903632576 `" ) && sleep 0' 11000 1726867141.73571: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867141.73574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867141.73579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867141.73588: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867141.73591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867141.73645: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867141.73648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867141.73697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11000 1726867141.76488: stdout chunk (state=3): >>>ansible-tmp-1726867141.7310061-11162-222303903632576=/root/.ansible/tmp/ansible-tmp-1726867141.7310061-11162-222303903632576 <<< 11000 1726867141.76623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867141.76650: stderr chunk (state=3): >>><<< 11000 1726867141.76653: stdout chunk (state=3): >>><<< 11000 1726867141.76666: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867141.7310061-11162-222303903632576=/root/.ansible/tmp/ansible-tmp-1726867141.7310061-11162-222303903632576 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11000 1726867141.76725: variable 'ansible_module_compression' from source: 
unknown 11000 1726867141.76745: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11000 1726867141.76794: variable 'ansible_facts' from source: unknown 11000 1726867141.76928: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867141.7310061-11162-222303903632576/AnsiballZ_setup.py 11000 1726867141.77028: Sending initial data 11000 1726867141.77031: Sent initial data (154 bytes) 11000 1726867141.77474: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867141.77479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867141.77482: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867141.77484: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867141.77490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867141.77492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867141.77535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867141.77538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867141.77596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11000 1726867141.79794: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11000 1726867141.79798: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867141.79841: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867141.79890: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpwj8ha0be /root/.ansible/tmp/ansible-tmp-1726867141.7310061-11162-222303903632576/AnsiballZ_setup.py <<< 11000 1726867141.79896: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867141.7310061-11162-222303903632576/AnsiballZ_setup.py" <<< 11000 1726867141.79935: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpwj8ha0be" to remote "/root/.ansible/tmp/ansible-tmp-1726867141.7310061-11162-222303903632576/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867141.7310061-11162-222303903632576/AnsiballZ_setup.py" <<< 11000 1726867141.81028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867141.81064: stderr chunk (state=3): >>><<< 11000 1726867141.81069: stdout chunk (state=3): >>><<< 11000 1726867141.81089: done transferring module to remote 11000 1726867141.81095: _low_level_execute_command(): starting 11000 1726867141.81100: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867141.7310061-11162-222303903632576/ /root/.ansible/tmp/ansible-tmp-1726867141.7310061-11162-222303903632576/AnsiballZ_setup.py && sleep 0' 11000 1726867141.81517: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867141.81520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867141.81523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867141.81525: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867141.81530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867141.81573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867141.81576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867141.81629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11000 1726867141.84198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867141.84218: stderr chunk (state=3): >>><<< 11000 1726867141.84221: stdout chunk (state=3): >>><<< 11000 1726867141.84238: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11000 1726867141.84242: _low_level_execute_command(): starting 11000 1726867141.84244: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867141.7310061-11162-222303903632576/AnsiballZ_setup.py && sleep 0' 11000 1726867141.84636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867141.84639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867141.84641: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867141.84644: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867141.84646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867141.84699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867141.84705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867141.84754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11000 1726867142.67662: stdout chunk (state=3): >>> <<< 11000 1726867142.67757: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", 
"LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "19", "second": "02", "epoch": "1726867142", "epoch_int": "1726867142", "date": "2024-09-20", "time": "17:19:02", "iso8601_micro": "2024-09-20T21:19:02.275289Z", "iso8601": "2024-09-20T21:19:02Z", "iso8601_basic": "20240920T171902275289", "iso8601_basic_short": "20240920T171902", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.3310546875, "5m": 0.2265625, "15m": 0.10986328125}, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Za7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_<<< 11000 1726867142.67870: stdout chunk (state=3): >>>iqn": "", "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, 
"version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2950, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 581, "free": 2950}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_uuid": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 387, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261793374208, "block_size": 4096, "block_total": 65519099, "block_available": 63914398, "block_used": 1604701, "inode_total": 131070960, "inode_available": 
131029068, "inode_used": 41892, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.57", "broadcast": "10.31.15.2<<< 11000 1726867142.67881: stdout chunk (state=3): >>>55", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:feff:fed3:7d4f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": 
"on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.57"], "ansible_all_ipv6_addresses": ["fe80::8ff:feff:fed3:7d4f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.57", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:feff:fed3:7d4f"]}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11000 1726867142.70598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 11000 1726867142.70603: stdout chunk (state=3): >>><<< 11000 1726867142.70605: stderr chunk (state=3): >>><<< 11000 1726867142.70793: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/1", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.65 32980 10.31.12.57 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.65 32980 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-12-57", "ansible_nodename": "ip-10-31-12-57.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec293fb3626e3a20695ae06b45478339", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "17", "minute": "19", "second": "02", "epoch": "1726867142", "epoch_int": "1726867142", "date": "2024-09-20", "time": "17:19:02", "iso8601_micro": "2024-09-20T21:19:02.275289Z", "iso8601": "2024-09-20T21:19:02Z", "iso8601_basic": "20240920T171902275289", "iso8601_basic_short": "20240920T171902", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.3310546875, "5m": 0.2265625, "15m": 0.10986328125}, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC7JVDfMeZKYw4NvDf4J6T4eu3duEI1TDN8eY5Ag46A+Ty47bFYfPmW8jVxlz3g+Tlfs7803yjUxR8BhfnXFZj/ShR0Zt/NELUYUVHxS02yzVAX46Y/KQOzI9qRt8tn6zOckZ/+JxKdaH4KujKn7hn6gshq1vw8EYiHTG0Qh6hfm5GPWLD5l6fJeToO5P4jLX8zZS6NMoZR+K0P0F/xOkWEwjI1nJbD4GE/YiqzqLHq6U6rqEJJJWonNID6UzPfdWm+n8LyKoVCKBkDEBVl2RUr8Tsnq4MvYG+29djt/3smMIshPKMV+5fzmOvIUzv2YNfQB8w6aFoUnL8qSaEvV8A/30HdDOfRMCUanxsl1eSz0oMgGgwuQGW+lT1FSzP9U9mEQM92nj5Xgp0vf3oGttMW7RHoOjnkx3T8GVpOuPHnV0/Za7EXFaFun607WeBN2SsoO8UQ5HyKRLlC6ISzWOkWAc0L6v/tAMtxHQG5Bp40E0MHFpDc2SEbbFD+SVTfFQM=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBV4LdcoMAl+JydFQSAxZ6GfPzd/6UfaeOa/SPTjnrI5J8u4+cAsuyFQSKSblfcVNXleTIvzCZHrC699g4HQaHE=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAII78+YWuBOZy60GFrh19oZTZhmiNQUWzC28D2cLLUyoq", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2950, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 581, "free": 2950}, "nocache": {"free": 3283, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_uuid": "ec293fb3-626e-3a20-695a-e06b45478339", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": 
{"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 387, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261793374208, "block_size": 4096, "block_total": 65519099, "block_available": 63914398, "block_used": 1604701, "inode_total": 131070960, "inode_available": 131029068, "inode_used": 41892, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:feff:fed3:7d4f", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", 
"esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.12.57", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:fe:d3:7d:4f", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.12.57"], "ansible_all_ipv6_addresses": ["fe80::8ff:feff:fed3:7d4f"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.12.57", 
"127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:feff:fed3:7d4f"]}, "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 11000 1726867142.70992: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867141.7310061-11162-222303903632576/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867142.71012: _low_level_execute_command(): starting 11000 1726867142.71015: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867141.7310061-11162-222303903632576/ > /dev/null 2>&1 && sleep 0' 11000 1726867142.71416: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867142.71420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867142.71433: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867142.71485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867142.71501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867142.71562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11000 1726867142.74250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867142.74253: stdout chunk (state=3): >>><<< 11000 1726867142.74255: stderr chunk (state=3): >>><<< 11000 1726867142.74268: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11000 1726867142.74482: handler run complete 11000 1726867142.74485: variable 'ansible_facts' from source: unknown 11000 1726867142.74509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867142.74816: variable 'ansible_facts' from source: unknown 11000 1726867142.74906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867142.75045: attempt loop complete, returning result 11000 1726867142.75054: _execute() done 11000 1726867142.75060: dumping result to json 11000 1726867142.75096: done dumping result, returning 11000 1726867142.75107: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [0affcac9-a3a5-c734-026a-000000000129] 11000 1726867142.75115: sending task result for task 0affcac9-a3a5-c734-026a-000000000129 ok: [managed_node1] 11000 1726867142.75826: no more pending results, returning what we have 11000 1726867142.75829: results queue empty 11000 1726867142.75830: checking for any_errors_fatal 11000 1726867142.75831: done checking for any_errors_fatal 11000 1726867142.75832: checking for max_fail_percentage 11000 1726867142.75834: done checking for max_fail_percentage 11000 1726867142.75835: checking to see if all hosts have failed and the running result is not ok 11000 1726867142.75835: done checking to see if all hosts have failed 11000 1726867142.75836: getting the remaining hosts for this loop 11000 1726867142.75837: done getting the remaining hosts for this 
loop 11000 1726867142.75841: getting the next task for host managed_node1 11000 1726867142.75847: done getting next task for host managed_node1 11000 1726867142.75848: ^ task is: TASK: meta (flush_handlers) 11000 1726867142.75851: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867142.75854: getting variables 11000 1726867142.75855: in VariableManager get_vars() 11000 1726867142.75890: Calling all_inventory to load vars for managed_node1 11000 1726867142.75893: Calling groups_inventory to load vars for managed_node1 11000 1726867142.75896: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867142.76088: Calling all_plugins_play to load vars for managed_node1 11000 1726867142.76092: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867142.76095: Calling groups_plugins_play to load vars for managed_node1 11000 1726867142.76233: done sending task result for task 0affcac9-a3a5-c734-026a-000000000129 11000 1726867142.76237: WORKER PROCESS EXITING 11000 1726867142.76262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867142.76455: done with get_vars() 11000 1726867142.76465: done getting variables 11000 1726867142.76535: in VariableManager get_vars() 11000 1726867142.76548: Calling all_inventory to load vars for managed_node1 11000 1726867142.76550: Calling groups_inventory to load vars for managed_node1 11000 1726867142.76552: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867142.76556: Calling all_plugins_play to load vars for managed_node1 11000 1726867142.76558: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867142.76561: Calling groups_plugins_play to load vars for managed_node1 11000 1726867142.76709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867142.76901: done with get_vars() 11000 1726867142.76913: done queuing things up, now waiting for results queue to drain 11000 1726867142.76915: results queue empty 11000 1726867142.76916: checking for any_errors_fatal 11000 1726867142.76919: done checking for any_errors_fatal 11000 1726867142.76920: checking for max_fail_percentage 11000 1726867142.76921: done checking for max_fail_percentage 11000 1726867142.76921: checking to see if all hosts have failed and the running result is not ok 11000 1726867142.76926: done checking to see if all hosts have failed 11000 1726867142.76926: getting the remaining hosts for this loop 11000 1726867142.76927: done getting the remaining hosts for this loop 11000 1726867142.76930: getting the next task for host managed_node1 11000 1726867142.76933: done getting next task for host managed_node1 11000 1726867142.76935: ^ task is: TASK: INIT Prepare setup 11000 1726867142.76937: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867142.76939: getting variables 11000 1726867142.76939: in VariableManager get_vars() 11000 1726867142.76951: Calling all_inventory to load vars for managed_node1 11000 1726867142.76953: Calling groups_inventory to load vars for managed_node1 11000 1726867142.76954: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867142.76958: Calling all_plugins_play to load vars for managed_node1 11000 1726867142.76961: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867142.76963: Calling groups_plugins_play to load vars for managed_node1 11000 1726867142.77096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867142.77287: done with get_vars() 11000 1726867142.77295: done getting variables 11000 1726867142.77362: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:15 Friday 20 September 2024 17:19:02 -0400 (0:00:01.087) 0:00:04.417 ****** 11000 1726867142.77405: entering _queue_task() for managed_node1/debug 11000 1726867142.77407: Creating lock for debug 11000 1726867142.77645: worker is 1 (out of 1 available) 11000 1726867142.77655: exiting _queue_task() for managed_node1/debug 11000 1726867142.77666: done queuing things up, now waiting for results queue to drain 11000 1726867142.77667: waiting for pending results... 
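The setup module run shown above returns a single JSON object on stdout whose "ansible_facts" key carries everything later conditionals read; the very next task's test (ansible_distribution_major_version != '6') is evaluated against those facts. As a minimal illustration only, and not part of this playbook run, the Python sketch below loads such a payload from a local file and pulls out the fields this log relies on later; the file name facts.json is a placeholder assumption, not something produced by ansible-core.

    import json

    # Load a captured setup-module payload, e.g. the JSON object printed in the
    # stdout above saved to a local file. "facts.json" is only a placeholder.
    with open("facts.json") as fh:
        payload = json.load(fh)

    facts = payload["ansible_facts"]

    # Fields that the subsequent tasks in this log actually use.
    print(facts["ansible_distribution"],                # e.g. "CentOS"
          facts["ansible_distribution_major_version"])  # e.g. "10"
    print(facts["ansible_default_ipv4"]["address"])     # e.g. "10.31.12.57"

    # The conditional evaluated for "INIT Prepare setup" and "Install dnsmasq":
    print(facts["ansible_distribution_major_version"] != "6")  # True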
11000 1726867142.77893: running TaskExecutor() for managed_node1/TASK: INIT Prepare setup 11000 1726867142.77980: in run() - task 0affcac9-a3a5-c734-026a-00000000000b 11000 1726867142.78000: variable 'ansible_search_path' from source: unknown 11000 1726867142.78041: calling self._execute() 11000 1726867142.78125: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867142.78136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867142.78149: variable 'omit' from source: magic vars 11000 1726867142.78504: variable 'ansible_distribution_major_version' from source: facts 11000 1726867142.78521: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867142.78532: variable 'omit' from source: magic vars 11000 1726867142.78561: variable 'omit' from source: magic vars 11000 1726867142.78603: variable 'omit' from source: magic vars 11000 1726867142.78642: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867142.78688: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867142.78713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867142.78735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867142.78754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867142.78796: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867142.78805: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867142.78812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867142.78931: Set connection var ansible_shell_type to sh 11000 1726867142.78942: Set connection var ansible_pipelining to False 11000 1726867142.78955: Set connection var ansible_shell_executable to /bin/sh 11000 1726867142.78961: Set connection var ansible_connection to ssh 11000 1726867142.78969: Set connection var ansible_timeout to 10 11000 1726867142.78979: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867142.79010: variable 'ansible_shell_executable' from source: unknown 11000 1726867142.79018: variable 'ansible_connection' from source: unknown 11000 1726867142.79083: variable 'ansible_module_compression' from source: unknown 11000 1726867142.79087: variable 'ansible_shell_type' from source: unknown 11000 1726867142.79090: variable 'ansible_shell_executable' from source: unknown 11000 1726867142.79093: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867142.79095: variable 'ansible_pipelining' from source: unknown 11000 1726867142.79097: variable 'ansible_timeout' from source: unknown 11000 1726867142.79099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867142.79187: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867142.79204: variable 'omit' from source: magic vars 11000 1726867142.79220: starting attempt loop 11000 1726867142.79228: running the handler 11000 
1726867142.79275: handler run complete 11000 1726867142.79300: attempt loop complete, returning result 11000 1726867142.79683: _execute() done 11000 1726867142.79686: dumping result to json 11000 1726867142.79688: done dumping result, returning 11000 1726867142.79691: done running TaskExecutor() for managed_node1/TASK: INIT Prepare setup [0affcac9-a3a5-c734-026a-00000000000b] 11000 1726867142.79693: sending task result for task 0affcac9-a3a5-c734-026a-00000000000b 11000 1726867142.79749: done sending task result for task 0affcac9-a3a5-c734-026a-00000000000b 11000 1726867142.79752: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: ################################################## 11000 1726867142.79789: no more pending results, returning what we have 11000 1726867142.79792: results queue empty 11000 1726867142.79793: checking for any_errors_fatal 11000 1726867142.79794: done checking for any_errors_fatal 11000 1726867142.79795: checking for max_fail_percentage 11000 1726867142.79796: done checking for max_fail_percentage 11000 1726867142.79797: checking to see if all hosts have failed and the running result is not ok 11000 1726867142.79798: done checking to see if all hosts have failed 11000 1726867142.79798: getting the remaining hosts for this loop 11000 1726867142.79799: done getting the remaining hosts for this loop 11000 1726867142.79802: getting the next task for host managed_node1 11000 1726867142.79808: done getting next task for host managed_node1 11000 1726867142.79810: ^ task is: TASK: Install dnsmasq 11000 1726867142.79813: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867142.79816: getting variables 11000 1726867142.79817: in VariableManager get_vars() 11000 1726867142.79848: Calling all_inventory to load vars for managed_node1 11000 1726867142.79850: Calling groups_inventory to load vars for managed_node1 11000 1726867142.79853: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867142.79861: Calling all_plugins_play to load vars for managed_node1 11000 1726867142.79863: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867142.79866: Calling groups_plugins_play to load vars for managed_node1 11000 1726867142.80199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867142.80391: done with get_vars() 11000 1726867142.80400: done getting variables 11000 1726867142.80452: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 17:19:02 -0400 (0:00:00.030) 0:00:04.447 ****** 11000 1726867142.80481: entering _queue_task() for managed_node1/package 11000 1726867142.80692: worker is 1 (out of 1 available) 11000 1726867142.80703: exiting _queue_task() for managed_node1/package 11000 1726867142.80715: done queuing things up, now waiting for results queue to drain 11000 1726867142.80716: waiting for pending results... 
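Each record in this verbose stream begins with the worker PID and a Unix timestamp with microseconds (for example "11000 1726867142.80481"), and each TASK banner is followed by the task path, a wall-clock date, the duration of the previous task in parentheses and the cumulative elapsed time. The sketch below is an illustration only: the regular expressions are assumptions inferred from the records visible here (not an ansible-core interface), and the log file name is a placeholder; it pulls per-record timestamps and per-task durations out of a saved copy of output like this.

    import re

    # Assumed record shapes, based on this log:
    #   "11000 1726867142.80481: entering _queue_task() ..."
    #   "Friday 20 September 2024 17:19:02 -0400 (0:00:00.030)  0:00:04.447"
    RECORD = re.compile(r"^(?P<pid>\d+) (?P<ts>\d+\.\d+): (?P<msg>.*)$")
    TIMING = re.compile(r"\((?P<last>\d+:\d+:\d+\.\d+)\)\s+(?P<total>\d+:\d+:\d+\.\d+)")

    def scan(path="ansible_verbose.log"):  # placeholder file name
        """Yield (timestamp, message) pairs; print task timing lines as seen."""
        with open(path) as fh:
            for line in fh:
                m = RECORD.match(line)
                if m:
                    yield float(m.group("ts")), m.group("msg")
                t = TIMING.search(line)
                if t:
                    print("previous task took", t.group("last"),
                          "cumulative elapsed", t.group("total"))

    if __name__ == "__main__":
        records = list(scan())
        if records:
            # Span of the captured records, in seconds.
            print("log covers", records[-1][0] - records[0][0], "seconds")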
11000 1726867142.80941: running TaskExecutor() for managed_node1/TASK: Install dnsmasq 11000 1726867142.81045: in run() - task 0affcac9-a3a5-c734-026a-00000000000f 11000 1726867142.81071: variable 'ansible_search_path' from source: unknown 11000 1726867142.81083: variable 'ansible_search_path' from source: unknown 11000 1726867142.81120: calling self._execute() 11000 1726867142.81199: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867142.81211: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867142.81225: variable 'omit' from source: magic vars 11000 1726867142.81570: variable 'ansible_distribution_major_version' from source: facts 11000 1726867142.81591: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867142.81606: variable 'omit' from source: magic vars 11000 1726867142.81657: variable 'omit' from source: magic vars 11000 1726867142.82083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867142.84550: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867142.84623: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867142.84664: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867142.84705: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867142.84742: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867142.84839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867142.84873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867142.84908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867142.84959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867142.84980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867142.85089: variable '__network_is_ostree' from source: set_fact 11000 1726867142.85101: variable 'omit' from source: magic vars 11000 1726867142.85134: variable 'omit' from source: magic vars 11000 1726867142.85168: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867142.85201: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867142.85223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867142.85247: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11000 1726867142.85263: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867142.85303: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867142.85311: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867142.85384: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867142.85424: Set connection var ansible_shell_type to sh 11000 1726867142.85438: Set connection var ansible_pipelining to False 11000 1726867142.85452: Set connection var ansible_shell_executable to /bin/sh 11000 1726867142.85460: Set connection var ansible_connection to ssh 11000 1726867142.85471: Set connection var ansible_timeout to 10 11000 1726867142.85483: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867142.85517: variable 'ansible_shell_executable' from source: unknown 11000 1726867142.85524: variable 'ansible_connection' from source: unknown 11000 1726867142.85532: variable 'ansible_module_compression' from source: unknown 11000 1726867142.85537: variable 'ansible_shell_type' from source: unknown 11000 1726867142.85543: variable 'ansible_shell_executable' from source: unknown 11000 1726867142.85550: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867142.85556: variable 'ansible_pipelining' from source: unknown 11000 1726867142.85562: variable 'ansible_timeout' from source: unknown 11000 1726867142.85583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867142.85671: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867142.85708: variable 'omit' from source: magic vars 11000 1726867142.85711: starting attempt loop 11000 1726867142.85713: running the handler 11000 1726867142.85715: variable 'ansible_facts' from source: unknown 11000 1726867142.85720: variable 'ansible_facts' from source: unknown 11000 1726867142.85783: _low_level_execute_command(): starting 11000 1726867142.85786: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867142.86491: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867142.86582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 
1726867142.86601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867142.86617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867142.86700: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11000 1726867142.88815: stdout chunk (state=3): >>>/root <<< 11000 1726867142.88978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867142.88982: stdout chunk (state=3): >>><<< 11000 1726867142.88984: stderr chunk (state=3): >>><<< 11000 1726867142.89009: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11000 1726867142.89092: _low_level_execute_command(): starting 11000 1726867142.89096: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867142.8902824-11201-146458993689043 `" && echo ansible-tmp-1726867142.8902824-11201-146458993689043="` echo /root/.ansible/tmp/ansible-tmp-1726867142.8902824-11201-146458993689043 `" ) && sleep 0' 11000 1726867142.89955: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867142.89970: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867142.89988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867142.90006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867142.90095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 
11000 1726867142.90129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867142.90214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867142.92386: stdout chunk (state=3): >>>ansible-tmp-1726867142.8902824-11201-146458993689043=/root/.ansible/tmp/ansible-tmp-1726867142.8902824-11201-146458993689043 <<< 11000 1726867142.92565: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867142.92598: stderr chunk (state=3): >>><<< 11000 1726867142.92621: stdout chunk (state=3): >>><<< 11000 1726867142.92673: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867142.8902824-11201-146458993689043=/root/.ansible/tmp/ansible-tmp-1726867142.8902824-11201-146458993689043 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867142.92689: variable 'ansible_module_compression' from source: unknown 11000 1726867142.92791: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 11000 1726867142.92794: ANSIBALLZ: Acquiring lock 11000 1726867142.92796: ANSIBALLZ: Lock acquired: 139984830862384 11000 1726867142.92798: ANSIBALLZ: Creating module 11000 1726867143.05616: ANSIBALLZ: Writing module into payload 11000 1726867143.05754: ANSIBALLZ: Writing module 11000 1726867143.05771: ANSIBALLZ: Renaming module 11000 1726867143.05776: ANSIBALLZ: Done creating module 11000 1726867143.05794: variable 'ansible_facts' from source: unknown 11000 1726867143.05861: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867142.8902824-11201-146458993689043/AnsiballZ_dnf.py 11000 1726867143.05964: Sending initial data 11000 1726867143.05968: Sent initial data (152 bytes) 11000 1726867143.06364: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867143.06405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867143.06408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867143.06410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867143.06413: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867143.06415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867143.06455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867143.06458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867143.06534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11000 1726867143.08822: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11000 1726867143.08826: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867143.08875: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867143.08925: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmp84tp2xnw /root/.ansible/tmp/ansible-tmp-1726867142.8902824-11201-146458993689043/AnsiballZ_dnf.py <<< 11000 1726867143.08928: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867142.8902824-11201-146458993689043/AnsiballZ_dnf.py" <<< 11000 1726867143.08970: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmp84tp2xnw" to remote "/root/.ansible/tmp/ansible-tmp-1726867142.8902824-11201-146458993689043/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867142.8902824-11201-146458993689043/AnsiballZ_dnf.py" <<< 11000 1726867143.09658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867143.09692: stderr chunk (state=3): >>><<< 11000 1726867143.09699: stdout chunk (state=3): >>><<< 11000 1726867143.09728: done transferring module to remote 11000 1726867143.09737: _low_level_execute_command(): starting 11000 1726867143.09741: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867142.8902824-11201-146458993689043/ /root/.ansible/tmp/ansible-tmp-1726867142.8902824-11201-146458993689043/AnsiballZ_dnf.py && sleep 0' 11000 1726867143.10138: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867143.10170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867143.10173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867143.10175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867143.10179: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867143.10181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867143.10183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867143.10222: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867143.10239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867143.10294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11000 1726867143.12843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867143.12865: stderr chunk (state=3): >>><<< 11000 1726867143.12868: stdout chunk (state=3): >>><<< 11000 1726867143.12886: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11000 1726867143.12889: _low_level_execute_command(): starting 11000 1726867143.12896: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867142.8902824-11201-146458993689043/AnsiballZ_dnf.py && sleep 0' 11000 1726867143.13300: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867143.13303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867143.13305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867143.13307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867143.13309: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867143.13354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867143.13357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867143.13431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11000 1726867144.71937: stdout chunk (state=3): >>> {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-3.el10.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": 
null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11000 1726867144.77180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867144.77201: stderr chunk (state=3): >>><<< 11000 1726867144.77204: stdout chunk (state=3): >>><<< 11000 1726867144.77219: _low_level_execute_command() done: rc=0, stdout= {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-3.el10.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
11000 1726867144.77257: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867142.8902824-11201-146458993689043/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867144.77274: _low_level_execute_command(): starting 11000 1726867144.77279: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867142.8902824-11201-146458993689043/ > /dev/null 2>&1 && sleep 0' 11000 1726867144.77744: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867144.77747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867144.77750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11000 1726867144.77752: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867144.77754: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867144.77812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867144.77816: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867144.77857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867144.79802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867144.79815: stderr chunk (state=3): >>><<< 11000 1726867144.79822: stdout chunk (state=3): >>><<< 11000 1726867144.79838: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867144.79841: handler run complete 11000 1726867144.79999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867144.80191: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867144.80224: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867144.80252: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867144.80279: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867144.80344: variable '__install_status' from source: unknown 11000 1726867144.80359: Evaluated conditional (__install_status is success): True 11000 1726867144.80392: attempt loop complete, returning result 11000 1726867144.80396: _execute() done 11000 1726867144.80398: dumping result to json 11000 1726867144.80402: done dumping result, returning 11000 1726867144.80452: done running TaskExecutor() for managed_node1/TASK: Install dnsmasq [0affcac9-a3a5-c734-026a-00000000000f] 11000 1726867144.80455: sending task result for task 0affcac9-a3a5-c734-026a-00000000000f 11000 1726867144.80581: done sending task result for task 0affcac9-a3a5-c734-026a-00000000000f 11000 1726867144.80584: WORKER PROCESS EXITING changed: [managed_node1] => { "attempts": 1, "changed": true, "rc": 0, "results": [ "Installed: dnsmasq-2.90-3.el10.x86_64" ] } 11000 1726867144.80736: no more pending results, returning what we have 11000 1726867144.80739: results queue empty 11000 1726867144.80740: checking for any_errors_fatal 11000 1726867144.80745: done checking for any_errors_fatal 11000 1726867144.80769: checking for max_fail_percentage 11000 1726867144.80771: done checking for max_fail_percentage 11000 1726867144.80771: checking to see if all hosts have failed and the running result is not ok 11000 1726867144.80772: done checking to see if all hosts have failed 11000 1726867144.80773: getting the remaining hosts for this loop 11000 1726867144.80774: done getting the remaining hosts for this loop 11000 1726867144.80779: getting the next task for host managed_node1 11000 1726867144.80785: done getting next task for host managed_node1 11000 1726867144.80787: ^ task is: TASK: Install pgrep, sysctl 11000 1726867144.80811: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867144.80815: getting variables 11000 1726867144.80817: in VariableManager get_vars() 11000 1726867144.80891: Calling all_inventory to load vars for managed_node1 11000 1726867144.80894: Calling groups_inventory to load vars for managed_node1 11000 1726867144.80896: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867144.80906: Calling all_plugins_play to load vars for managed_node1 11000 1726867144.80909: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867144.80911: Calling groups_plugins_play to load vars for managed_node1 11000 1726867144.81052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867144.81197: done with get_vars() 11000 1726867144.81205: done getting variables 11000 1726867144.81246: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 17:19:04 -0400 (0:00:02.007) 0:00:06.455 ****** 11000 1726867144.81281: entering _queue_task() for managed_node1/package 11000 1726867144.81482: worker is 1 (out of 1 available) 11000 1726867144.81494: exiting _queue_task() for managed_node1/package 11000 1726867144.81507: done queuing things up, now waiting for results queue to drain 11000 1726867144.81508: waiting for pending results... 
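The next task, at create_test_interfaces_with_dhcp.yml:17, is about to be skipped: the entries below evaluate `ansible_distribution_major_version is version('6', '<=')` to False on this host (and also check `ansible_os_family == 'RedHat'`, which may be a block-level condition). Its module arguments are never resolved, so only the conditional is certain; a rough sketch, where the package name is an assumption not shown anywhere in this log:

    - name: Install pgrep, sysctl
      package:
        name: procps        # assumption: the EL6-era package name; the log skips before resolving it
        state: present
      when: ansible_distribution_major_version is version('6', '<=')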
11000 1726867144.81800: running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl 11000 1726867144.81860: in run() - task 0affcac9-a3a5-c734-026a-000000000010 11000 1726867144.81898: variable 'ansible_search_path' from source: unknown 11000 1726867144.81901: variable 'ansible_search_path' from source: unknown 11000 1726867144.81944: calling self._execute() 11000 1726867144.82100: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867144.82108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867144.82112: variable 'omit' from source: magic vars 11000 1726867144.82533: variable 'ansible_distribution_major_version' from source: facts 11000 1726867144.82536: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867144.82565: variable 'ansible_os_family' from source: facts 11000 1726867144.82580: Evaluated conditional (ansible_os_family == 'RedHat'): True 11000 1726867144.82772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867144.83090: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867144.83132: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867144.83163: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867144.83202: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867144.83275: variable 'ansible_distribution_major_version' from source: facts 11000 1726867144.83281: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 11000 1726867144.83332: when evaluation is False, skipping this task 11000 1726867144.83335: _execute() done 11000 1726867144.83337: dumping result to json 11000 1726867144.83340: done dumping result, returning 11000 1726867144.83344: done running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl [0affcac9-a3a5-c734-026a-000000000010] 11000 1726867144.83346: sending task result for task 0affcac9-a3a5-c734-026a-000000000010 11000 1726867144.83411: done sending task result for task 0affcac9-a3a5-c734-026a-000000000010 11000 1726867144.83413: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 11000 1726867144.83458: no more pending results, returning what we have 11000 1726867144.83461: results queue empty 11000 1726867144.83462: checking for any_errors_fatal 11000 1726867144.83466: done checking for any_errors_fatal 11000 1726867144.83467: checking for max_fail_percentage 11000 1726867144.83468: done checking for max_fail_percentage 11000 1726867144.83469: checking to see if all hosts have failed and the running result is not ok 11000 1726867144.83469: done checking to see if all hosts have failed 11000 1726867144.83470: getting the remaining hosts for this loop 11000 1726867144.83471: done getting the remaining hosts for this loop 11000 1726867144.83474: getting the next task for host managed_node1 11000 1726867144.83480: done getting next task for host managed_node1 11000 1726867144.83482: ^ task is: TASK: Install pgrep, sysctl 11000 1726867144.83484: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867144.83487: getting variables 11000 1726867144.83488: in VariableManager get_vars() 11000 1726867144.83518: Calling all_inventory to load vars for managed_node1 11000 1726867144.83521: Calling groups_inventory to load vars for managed_node1 11000 1726867144.83530: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867144.83537: Calling all_plugins_play to load vars for managed_node1 11000 1726867144.83539: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867144.83541: Calling groups_plugins_play to load vars for managed_node1 11000 1726867144.83673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867144.83809: done with get_vars() 11000 1726867144.83816: done getting variables 11000 1726867144.83858: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Friday 20 September 2024 17:19:04 -0400 (0:00:00.025) 0:00:06.481 ****** 11000 1726867144.83876: entering _queue_task() for managed_node1/package 11000 1726867144.84070: worker is 1 (out of 1 available) 11000 1726867144.84082: exiting _queue_task() for managed_node1/package 11000 1726867144.84093: done queuing things up, now waiting for results queue to drain 11000 1726867144.84095: waiting for pending results... 
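The second "Install pgrep, sysctl" variant, at line 26 of the same tasks file, does run on this host: the entries below evaluate `ansible_distribution_major_version is version('7', '>=')` to True, and the dnf module is eventually invoked with name=["procps-ng"]. Reconstructed from those details, the task is likely close to:

    - name: Install pgrep, sysctl
      package:
        name: procps-ng
        state: present
      when: ansible_distribution_major_version is version('7', '>=')

As with the skipped variant, `ansible_os_family == 'RedHat'` is also evaluated; whether that condition sits on the task or on an enclosing block is not visible here.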
11000 1726867144.84257: running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl 11000 1726867144.84374: in run() - task 0affcac9-a3a5-c734-026a-000000000011 11000 1726867144.84383: variable 'ansible_search_path' from source: unknown 11000 1726867144.84387: variable 'ansible_search_path' from source: unknown 11000 1726867144.84476: calling self._execute() 11000 1726867144.84481: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867144.84487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867144.84496: variable 'omit' from source: magic vars 11000 1726867144.84857: variable 'ansible_distribution_major_version' from source: facts 11000 1726867144.84867: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867144.84997: variable 'ansible_os_family' from source: facts 11000 1726867144.85000: Evaluated conditional (ansible_os_family == 'RedHat'): True 11000 1726867144.85137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867144.85368: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867144.85418: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867144.85453: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867144.85480: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867144.85538: variable 'ansible_distribution_major_version' from source: facts 11000 1726867144.85550: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 11000 1726867144.85625: variable 'omit' from source: magic vars 11000 1726867144.85628: variable 'omit' from source: magic vars 11000 1726867144.85739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867144.88355: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867144.88391: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867144.88432: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867144.88485: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867144.88516: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867144.88598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867144.88618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867144.88635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867144.88660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867144.88671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867144.88744: variable '__network_is_ostree' from source: set_fact 11000 1726867144.88748: variable 'omit' from source: magic vars 11000 1726867144.88768: variable 'omit' from source: magic vars 11000 1726867144.88794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867144.88827: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867144.88836: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867144.88849: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867144.88857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867144.88880: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867144.88883: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867144.88885: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867144.88954: Set connection var ansible_shell_type to sh 11000 1726867144.88960: Set connection var ansible_pipelining to False 11000 1726867144.88967: Set connection var ansible_shell_executable to /bin/sh 11000 1726867144.88970: Set connection var ansible_connection to ssh 11000 1726867144.88975: Set connection var ansible_timeout to 10 11000 1726867144.88984: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867144.89010: variable 'ansible_shell_executable' from source: unknown 11000 1726867144.89013: variable 'ansible_connection' from source: unknown 11000 1726867144.89017: variable 'ansible_module_compression' from source: unknown 11000 1726867144.89020: variable 'ansible_shell_type' from source: unknown 11000 1726867144.89022: variable 'ansible_shell_executable' from source: unknown 11000 1726867144.89024: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867144.89026: variable 'ansible_pipelining' from source: unknown 11000 1726867144.89028: variable 'ansible_timeout' from source: unknown 11000 1726867144.89030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867144.89098: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867144.89107: variable 'omit' from source: magic vars 11000 1726867144.89110: starting attempt loop 11000 1726867144.89114: running the handler 11000 1726867144.89122: variable 'ansible_facts' from source: unknown 11000 1726867144.89125: variable 'ansible_facts' from source: unknown 11000 1726867144.89150: _low_level_execute_command(): starting 11000 1726867144.89156: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867144.89693: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867144.89696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867144.89753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867144.89776: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867144.89819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867144.91465: stdout chunk (state=3): >>>/root <<< 11000 1726867144.91562: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867144.91587: stderr chunk (state=3): >>><<< 11000 1726867144.91593: stdout chunk (state=3): >>><<< 11000 1726867144.91610: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867144.91665: _low_level_execute_command(): starting 11000 1726867144.91669: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867144.916099-11297-77911301326134 `" && echo ansible-tmp-1726867144.916099-11297-77911301326134="` echo /root/.ansible/tmp/ansible-tmp-1726867144.916099-11297-77911301326134 `" ) && sleep 0' 11000 1726867144.92105: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867144.92112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867144.92114: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867144.92116: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867144.92204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867144.92248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867144.94150: stdout chunk (state=3): >>>ansible-tmp-1726867144.916099-11297-77911301326134=/root/.ansible/tmp/ansible-tmp-1726867144.916099-11297-77911301326134 <<< 11000 1726867144.94289: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867144.94310: stderr chunk (state=3): >>><<< 11000 1726867144.94322: stdout chunk (state=3): >>><<< 11000 1726867144.94350: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867144.916099-11297-77911301326134=/root/.ansible/tmp/ansible-tmp-1726867144.916099-11297-77911301326134 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867144.94369: variable 'ansible_module_compression' from source: unknown 11000 1726867144.94417: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11000 1726867144.94450: variable 'ansible_facts' from source: unknown 11000 1726867144.94532: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867144.916099-11297-77911301326134/AnsiballZ_dnf.py 11000 1726867144.94621: Sending initial data 11000 1726867144.94624: Sent initial data (150 bytes) 11000 1726867144.95045: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867144.95048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867144.95051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867144.95053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867144.95055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867144.95107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867144.95113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867144.95157: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867144.96720: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11000 1726867144.96727: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867144.96762: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867144.96805: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmphgpnu640 /root/.ansible/tmp/ansible-tmp-1726867144.916099-11297-77911301326134/AnsiballZ_dnf.py <<< 11000 1726867144.96811: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867144.916099-11297-77911301326134/AnsiballZ_dnf.py" <<< 11000 1726867144.96852: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmphgpnu640" to remote "/root/.ansible/tmp/ansible-tmp-1726867144.916099-11297-77911301326134/AnsiballZ_dnf.py" <<< 11000 1726867144.96857: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867144.916099-11297-77911301326134/AnsiballZ_dnf.py" <<< 11000 1726867144.97508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867144.97542: stderr chunk (state=3): >>><<< 11000 1726867144.97545: stdout chunk (state=3): >>><<< 11000 1726867144.97573: done transferring module to remote 11000 1726867144.97583: _low_level_execute_command(): starting 11000 1726867144.97587: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867144.916099-11297-77911301326134/ /root/.ansible/tmp/ansible-tmp-1726867144.916099-11297-77911301326134/AnsiballZ_dnf.py && sleep 0' 11000 1726867144.98000: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867144.98003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867144.98005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867144.98007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867144.98009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867144.98054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867144.98057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867144.98109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867144.99840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867144.99862: stderr chunk (state=3): >>><<< 11000 1726867144.99865: stdout chunk (state=3): >>><<< 11000 1726867144.99876: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867144.99880: _low_level_execute_command(): starting 11000 1726867144.99883: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867144.916099-11297-77911301326134/AnsiballZ_dnf.py && sleep 0' 11000 1726867145.00272: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867145.00278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867145.00280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867145.00282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867145.00284: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867145.00329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867145.00332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867145.00385: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867145.41720: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": 
null}}} <<< 11000 1726867145.45822: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867145.45848: stderr chunk (state=3): >>><<< 11000 1726867145.45851: stdout chunk (state=3): >>><<< 11000 1726867145.45864: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
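The module result above shows the ansible.legacy.dnf task asking for procps-ng with state=present and getting back "Nothing to do" / changed: false, i.e. the package is already installed and the task is a no-op. The module drives the DNF Python API inside the AnsiballZ payload; purely as an illustrative shell equivalent of that idempotency check (not what the module literally executes):

# rough manual equivalent of the dnf module call above (name=procps-ng, state=present)
if ! rpm -q procps-ng >/dev/null 2>&1; then
    dnf -y install procps-ng
else
    echo "Nothing to do"    # matches the msg in the module result
fi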
11000 1726867145.45905: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867144.916099-11297-77911301326134/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867145.45912: _low_level_execute_command(): starting 11000 1726867145.45917: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867144.916099-11297-77911301326134/ > /dev/null 2>&1 && sleep 0' 11000 1726867145.46343: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867145.46346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867145.46348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867145.46350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867145.46408: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867145.46411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867145.46453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867145.48365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867145.48369: stdout chunk (state=3): >>><<< 11000 1726867145.48371: stderr chunk (state=3): >>><<< 11000 1726867145.48397: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867145.48505: handler run complete 11000 1726867145.48508: attempt loop complete, returning result 11000 1726867145.48511: _execute() done 11000 1726867145.48513: dumping result to json 11000 1726867145.48515: done dumping result, returning 11000 1726867145.48517: done running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl [0affcac9-a3a5-c734-026a-000000000011] 11000 1726867145.48519: sending task result for task 0affcac9-a3a5-c734-026a-000000000011 11000 1726867145.48696: done sending task result for task 0affcac9-a3a5-c734-026a-000000000011 11000 1726867145.48699: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11000 1726867145.48784: no more pending results, returning what we have 11000 1726867145.48788: results queue empty 11000 1726867145.48789: checking for any_errors_fatal 11000 1726867145.48796: done checking for any_errors_fatal 11000 1726867145.48797: checking for max_fail_percentage 11000 1726867145.48799: done checking for max_fail_percentage 11000 1726867145.48799: checking to see if all hosts have failed and the running result is not ok 11000 1726867145.48800: done checking to see if all hosts have failed 11000 1726867145.48801: getting the remaining hosts for this loop 11000 1726867145.48802: done getting the remaining hosts for this loop 11000 1726867145.48806: getting the next task for host managed_node1 11000 1726867145.48813: done getting next task for host managed_node1 11000 1726867145.48816: ^ task is: TASK: Create test interfaces 11000 1726867145.48819: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867145.48823: getting variables 11000 1726867145.48825: in VariableManager get_vars() 11000 1726867145.48867: Calling all_inventory to load vars for managed_node1 11000 1726867145.48870: Calling groups_inventory to load vars for managed_node1 11000 1726867145.48873: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867145.49004: Calling all_plugins_play to load vars for managed_node1 11000 1726867145.49008: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867145.49012: Calling groups_plugins_play to load vars for managed_node1 11000 1726867145.49400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867145.49607: done with get_vars() 11000 1726867145.49619: done getting variables 11000 1726867145.49717: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 17:19:05 -0400 (0:00:00.658) 0:00:07.140 ****** 11000 1726867145.49746: entering _queue_task() for managed_node1/shell 11000 1726867145.49748: Creating lock for shell 11000 1726867145.50268: worker is 1 (out of 1 available) 11000 1726867145.50282: exiting _queue_task() for managed_node1/shell 11000 1726867145.50516: done queuing things up, now waiting for results queue to drain 11000 1726867145.50518: waiting for pending results... 
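Every remote command in this run reuses the persistent SSH connection visible in the debug output ("auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354'"), so each _low_level_execute_command() only opens a new mux session (mux_client_request_session) instead of a full SSH handshake, and ControlPersist keeps that master alive between tasks. As a sketch only, with the socket path and address taken from the log above, the master could be inspected from the controller like this:

# ask the ControlMaster behind the socket whether it is still alive
ssh -O check -S /root/.ansible/cp/ac0999e354 root@10.31.12.57
# and, if ever needed, tear it down explicitly
ssh -O exit -S /root/.ansible/cp/ac0999e354 root@10.31.12.57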
11000 1726867145.50858: running TaskExecutor() for managed_node1/TASK: Create test interfaces 11000 1726867145.51026: in run() - task 0affcac9-a3a5-c734-026a-000000000012 11000 1726867145.51065: variable 'ansible_search_path' from source: unknown 11000 1726867145.51074: variable 'ansible_search_path' from source: unknown 11000 1726867145.51115: calling self._execute() 11000 1726867145.51205: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867145.51253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867145.51271: variable 'omit' from source: magic vars 11000 1726867145.52148: variable 'ansible_distribution_major_version' from source: facts 11000 1726867145.52157: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867145.52171: variable 'omit' from source: magic vars 11000 1726867145.52281: variable 'omit' from source: magic vars 11000 1726867145.52915: variable 'dhcp_interface1' from source: play vars 11000 1726867145.52919: variable 'dhcp_interface2' from source: play vars 11000 1726867145.52944: variable 'omit' from source: magic vars 11000 1726867145.52982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867145.53011: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867145.53026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867145.53039: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867145.53048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867145.53070: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867145.53073: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867145.53075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867145.53148: Set connection var ansible_shell_type to sh 11000 1726867145.53155: Set connection var ansible_pipelining to False 11000 1726867145.53162: Set connection var ansible_shell_executable to /bin/sh 11000 1726867145.53165: Set connection var ansible_connection to ssh 11000 1726867145.53170: Set connection var ansible_timeout to 10 11000 1726867145.53175: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867145.53198: variable 'ansible_shell_executable' from source: unknown 11000 1726867145.53201: variable 'ansible_connection' from source: unknown 11000 1726867145.53205: variable 'ansible_module_compression' from source: unknown 11000 1726867145.53208: variable 'ansible_shell_type' from source: unknown 11000 1726867145.53210: variable 'ansible_shell_executable' from source: unknown 11000 1726867145.53212: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867145.53214: variable 'ansible_pipelining' from source: unknown 11000 1726867145.53216: variable 'ansible_timeout' from source: unknown 11000 1726867145.53219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867145.53316: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867145.53324: variable 'omit' from source: magic vars 11000 1726867145.53334: starting attempt loop 11000 1726867145.53337: running the handler 11000 1726867145.53342: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867145.53357: _low_level_execute_command(): starting 11000 1726867145.53363: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867145.53830: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867145.53833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867145.53836: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867145.53839: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867145.53890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867145.53903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867145.53962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867145.55606: stdout chunk (state=3): >>>/root <<< 11000 1726867145.55754: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867145.55759: stdout chunk (state=3): >>><<< 11000 1726867145.55762: stderr chunk (state=3): >>><<< 11000 1726867145.55787: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867145.55876: _low_level_execute_command(): starting 11000 1726867145.55881: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867145.5579355-11330-66522545239220 `" && echo ansible-tmp-1726867145.5579355-11330-66522545239220="` echo /root/.ansible/tmp/ansible-tmp-1726867145.5579355-11330-66522545239220 `" ) && sleep 0' 11000 1726867145.56446: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867145.56466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867145.56518: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867145.56521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867145.56571: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867145.58438: stdout chunk (state=3): >>>ansible-tmp-1726867145.5579355-11330-66522545239220=/root/.ansible/tmp/ansible-tmp-1726867145.5579355-11330-66522545239220 <<< 11000 1726867145.58683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867145.58689: stdout chunk (state=3): >>><<< 11000 1726867145.58692: stderr chunk (state=3): >>><<< 11000 1726867145.58695: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867145.5579355-11330-66522545239220=/root/.ansible/tmp/ansible-tmp-1726867145.5579355-11330-66522545239220 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867145.58697: variable 'ansible_module_compression' from source: unknown 11000 1726867145.58699: ANSIBALLZ: Using generic lock for ansible.legacy.command 11000 1726867145.58701: ANSIBALLZ: Acquiring lock 11000 1726867145.58703: ANSIBALLZ: Lock acquired: 139984830862384 11000 1726867145.58718: ANSIBALLZ: Creating module 11000 1726867145.67891: ANSIBALLZ: Writing module into payload 11000 1726867145.67950: ANSIBALLZ: Writing module 11000 1726867145.67966: ANSIBALLZ: Renaming module 11000 1726867145.67978: ANSIBALLZ: Done creating module 11000 1726867145.67992: variable 'ansible_facts' from source: unknown 11000 1726867145.68041: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867145.5579355-11330-66522545239220/AnsiballZ_command.py 11000 1726867145.68134: Sending initial data 11000 1726867145.68137: Sent initial data (155 bytes) 11000 1726867145.68580: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867145.68583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867145.68588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867145.68591: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867145.68593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867145.68643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867145.68646: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867145.68650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867145.68704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867145.70348: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension 
"home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867145.70405: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11000 1726867145.70450: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpb0h87ior /root/.ansible/tmp/ansible-tmp-1726867145.5579355-11330-66522545239220/AnsiballZ_command.py <<< 11000 1726867145.70461: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867145.5579355-11330-66522545239220/AnsiballZ_command.py" <<< 11000 1726867145.70517: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpb0h87ior" to remote "/root/.ansible/tmp/ansible-tmp-1726867145.5579355-11330-66522545239220/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867145.5579355-11330-66522545239220/AnsiballZ_command.py" <<< 11000 1726867145.71214: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867145.71330: stderr chunk (state=3): >>><<< 11000 1726867145.71333: stdout chunk (state=3): >>><<< 11000 1726867145.71335: done transferring module to remote 11000 1726867145.71337: _low_level_execute_command(): starting 11000 1726867145.71340: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867145.5579355-11330-66522545239220/ /root/.ansible/tmp/ansible-tmp-1726867145.5579355-11330-66522545239220/AnsiballZ_command.py && sleep 0' 11000 1726867145.71866: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867145.71869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867145.71875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867145.71879: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867145.71882: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867145.71934: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867145.71940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867145.71942: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867145.71988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867145.73723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867145.73746: stderr chunk (state=3): >>><<< 11000 1726867145.73750: stdout chunk (state=3): >>><<< 11000 1726867145.73768: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867145.73771: _low_level_execute_command(): starting 11000 1726867145.73773: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867145.5579355-11330-66522545239220/AnsiballZ_command.py && sleep 0' 11000 1726867145.74210: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867145.74214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867145.74216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867145.74228: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867145.74283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867145.74287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867145.74341: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867147.12130: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 700 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 700 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let 
timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 17:19:05.892642", "end": "2024-09-20 17:19:07.114193", "delta": "0:00:01.221551", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11000 1726867147.13292: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867147.13296: stderr chunk (state=3): >>><<< 11000 1726867147.13298: stdout chunk (state=3): >>><<< 11000 1726867147.13329: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 700 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 700 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 17:19:05.892642", "end": "2024-09-20 17:19:07.114193", "delta": "0:00:01.221551", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
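The result above contains the full "Create test interfaces" script as Ansible ran it; on this host (not RHEL 6, firewalld inactive) the stderr trace reduces to creating two veth pairs, enslaving their peer ends to a testbr bridge with static IPv4/IPv6 addresses, and starting dnsmasq as a combined DHCP4/DHCP6/RA server on that bridge. Condensed from the cmd field of the result, with the pgrep guard, the retry loop around the address assignment, and the RHEL 6 / firewalld branches omitted:

set -euxo pipefail
ip link add test1 type veth peer name test1p
ip link add test2 type veth peer name test2p
nmcli d set test1 managed true
nmcli d set test2 managed true
# NetworkManager must not manage the DHCP-server side of each veth pair
nmcli d set test1p managed false
nmcli d set test2p managed false
ip link set test1p up
ip link set test2p up
ip link add name testbr type bridge forward_delay 0
nmcli d set testbr managed false
ip link set testbr up
ip addr add 192.0.2.1/24 dev testbr
ip -6 addr add 2001:DB8::1/32 dev testbr
ip link set test1p master testbr
ip link set test2p master testbr
dnsmasq --pid-file=/run/dhcp_testbr.pid \
        --dhcp-leasefile=/run/dhcp_testbr.lease \
        --dhcp-range=192.0.2.1,192.0.2.254,240 \
        --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 \
        --enable-ra --interface=testbr --bind-interfaces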
11000 1726867147.13383: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867145.5579355-11330-66522545239220/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867147.13582: _low_level_execute_command(): starting 11000 1726867147.13586: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867145.5579355-11330-66522545239220/ > /dev/null 2>&1 && sleep 0' 11000 1726867147.14586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867147.14599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867147.14650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867147.14663: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867147.14700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867147.14728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867147.14739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867147.14861: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867147.14927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867147.16775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867147.16826: stderr chunk (state=3): >>><<< 11000 1726867147.16829: stdout chunk (state=3): >>><<< 11000 1726867147.16847: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867147.16860: handler run complete 11000 1726867147.16922: Evaluated conditional (False): False 11000 1726867147.16940: attempt loop complete, returning result 11000 1726867147.16948: _execute() done 11000 1726867147.16967: dumping result to json 11000 1726867147.17197: done dumping result, returning 11000 1726867147.17199: done running TaskExecutor() for managed_node1/TASK: Create test interfaces [0affcac9-a3a5-c734-026a-000000000012] 11000 1726867147.17201: sending task result for task 0affcac9-a3a5-c734-026a-000000000012 11000 1726867147.17269: done sending task result for task 0affcac9-a3a5-c734-026a-000000000012 11000 1726867147.17272: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.221551", "end": "2024-09-20 17:19:07.114193", "rc": 0, "start": "2024-09-20 17:19:05.892642" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 700 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 700 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 11000 1726867147.17446: no more pending results, returning what we have 11000 1726867147.17449: results queue empty 11000 1726867147.17450: checking for any_errors_fatal 11000 1726867147.17456: done checking for any_errors_fatal 11000 1726867147.17457: checking for max_fail_percentage 11000 1726867147.17459: done checking for max_fail_percentage 11000 1726867147.17459: checking to see if all hosts have failed and the running result is not ok 11000 1726867147.17460: done checking to see if all hosts have failed 11000 1726867147.17461: getting the remaining hosts for this loop 11000 1726867147.17462: done getting the remaining hosts for this loop 11000 1726867147.17465: getting the next task for host managed_node1 11000 1726867147.17473: done getting next task for host managed_node1 11000 1726867147.17475: ^ task is: TASK: 
Include the task 'get_interface_stat.yml' 11000 1726867147.17480: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867147.17484: getting variables 11000 1726867147.17485: in VariableManager get_vars() 11000 1726867147.17524: Calling all_inventory to load vars for managed_node1 11000 1726867147.17527: Calling groups_inventory to load vars for managed_node1 11000 1726867147.17530: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867147.17542: Calling all_plugins_play to load vars for managed_node1 11000 1726867147.17545: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867147.17548: Calling groups_plugins_play to load vars for managed_node1 11000 1726867147.18268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867147.18661: done with get_vars() 11000 1726867147.18671: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 17:19:07 -0400 (0:00:01.692) 0:00:08.832 ****** 11000 1726867147.18967: entering _queue_task() for managed_node1/include_tasks 11000 1726867147.19412: worker is 1 (out of 1 available) 11000 1726867147.19425: exiting _queue_task() for managed_node1/include_tasks 11000 1726867147.19436: done queuing things up, now waiting for results queue to drain 11000 1726867147.19437: waiting for pending results... 
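The 'Create test interfaces' task that just completed above runs its script through ansible.legacy.command with _uses_shell=True (i.e. the shell module) and reports "changed": false. A minimal sketch of how such a setup step is typically declared in a tasks file follows; the task name matches the log, but the block-scalar layout and the changed_when handling are assumptions, and the script body is abbreviated to its opening commands:

- name: Create test interfaces
  ansible.builtin.shell: |
    set -euxo pipefail
    exec 1>&2
    ip link add test1 type veth peer name test1p
    ip link add test2 type veth peer name test2p
    # ... bridge "testbr" setup, dnsmasq DHCPv4/DHCPv6 ranges, and the RHEL 6 radvd branch
    # continue exactly as captured verbatim in the command string recorded above ...
  changed_when: false  # assumption, but consistent with the "changed": false in the recorded result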
11000 1726867147.20033: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 11000 1726867147.20096: in run() - task 0affcac9-a3a5-c734-026a-000000000016 11000 1726867147.20116: variable 'ansible_search_path' from source: unknown 11000 1726867147.20134: variable 'ansible_search_path' from source: unknown 11000 1726867147.20582: calling self._execute() 11000 1726867147.20588: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867147.20591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867147.20593: variable 'omit' from source: magic vars 11000 1726867147.21195: variable 'ansible_distribution_major_version' from source: facts 11000 1726867147.21212: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867147.21223: _execute() done 11000 1726867147.21230: dumping result to json 11000 1726867147.21238: done dumping result, returning 11000 1726867147.21247: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-c734-026a-000000000016] 11000 1726867147.21256: sending task result for task 0affcac9-a3a5-c734-026a-000000000016 11000 1726867147.21483: no more pending results, returning what we have 11000 1726867147.21488: in VariableManager get_vars() 11000 1726867147.21530: Calling all_inventory to load vars for managed_node1 11000 1726867147.21533: Calling groups_inventory to load vars for managed_node1 11000 1726867147.21535: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867147.21547: Calling all_plugins_play to load vars for managed_node1 11000 1726867147.21549: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867147.21551: Calling groups_plugins_play to load vars for managed_node1 11000 1726867147.21926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867147.22313: done with get_vars() 11000 1726867147.22321: variable 'ansible_search_path' from source: unknown 11000 1726867147.22322: variable 'ansible_search_path' from source: unknown 11000 1726867147.22336: done sending task result for task 0affcac9-a3a5-c734-026a-000000000016 11000 1726867147.22340: WORKER PROCESS EXITING 11000 1726867147.22368: we have included files to process 11000 1726867147.22369: generating all_blocks data 11000 1726867147.22370: done generating all_blocks data 11000 1726867147.22371: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11000 1726867147.22372: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11000 1726867147.22374: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11000 1726867147.22759: done processing included file 11000 1726867147.22760: iterating over new_blocks loaded from include file 11000 1726867147.22762: in VariableManager get_vars() 11000 1726867147.22808: done with get_vars() 11000 1726867147.22810: filtering new block on tags 11000 1726867147.22827: done filtering new block on tags 11000 1726867147.22829: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 11000 
1726867147.22835: extending task lists for all hosts with included blocks 11000 1726867147.22938: done extending task lists 11000 1726867147.22940: done processing included files 11000 1726867147.22940: results queue empty 11000 1726867147.22941: checking for any_errors_fatal 11000 1726867147.22946: done checking for any_errors_fatal 11000 1726867147.22947: checking for max_fail_percentage 11000 1726867147.22948: done checking for max_fail_percentage 11000 1726867147.22948: checking to see if all hosts have failed and the running result is not ok 11000 1726867147.22949: done checking to see if all hosts have failed 11000 1726867147.22950: getting the remaining hosts for this loop 11000 1726867147.22951: done getting the remaining hosts for this loop 11000 1726867147.22953: getting the next task for host managed_node1 11000 1726867147.22958: done getting next task for host managed_node1 11000 1726867147.22959: ^ task is: TASK: Get stat for interface {{ interface }} 11000 1726867147.22962: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867147.22964: getting variables 11000 1726867147.22965: in VariableManager get_vars() 11000 1726867147.22979: Calling all_inventory to load vars for managed_node1 11000 1726867147.22981: Calling groups_inventory to load vars for managed_node1 11000 1726867147.22983: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867147.22993: Calling all_plugins_play to load vars for managed_node1 11000 1726867147.22995: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867147.22998: Calling groups_plugins_play to load vars for managed_node1 11000 1726867147.23135: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867147.23333: done with get_vars() 11000 1726867147.23341: done getting variables 11000 1726867147.23521: variable 'interface' from source: task vars 11000 1726867147.23526: variable 'dhcp_interface1' from source: play vars 11000 1726867147.23596: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:19:07 -0400 (0:00:00.046) 0:00:08.879 ****** 11000 1726867147.23634: entering _queue_task() for managed_node1/stat 11000 1726867147.23905: worker is 1 (out of 1 available) 11000 1726867147.23922: exiting _queue_task() for managed_node1/stat 11000 1726867147.23935: done queuing things up, now waiting for results queue to drain 11000 1726867147.23936: waiting for pending results... 
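The included get_interface_stat.yml drives the stat call whose arguments appear further down in this log (get_attributes/get_checksum/get_mime all false, path /sys/class/net/test1), with the result available afterwards as interface_stat. A plausible reconstruction of that task file, offered as an assumption since only the module arguments and the registered variable are visible here:

- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    get_attributes: false
    get_checksum: false
    get_mime: false
    path: /sys/class/net/{{ interface }}  # resolves to /sys/class/net/test1 for dhcp_interface1
  register: interface_stat

For the second DHCP interface the same file is included again with interface taken from dhcp_interface2, which is why the log later repeats this sequence for test2.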
11000 1726867147.24285: running TaskExecutor() for managed_node1/TASK: Get stat for interface test1 11000 1726867147.24504: in run() - task 0affcac9-a3a5-c734-026a-000000000153 11000 1726867147.24796: variable 'ansible_search_path' from source: unknown 11000 1726867147.24799: variable 'ansible_search_path' from source: unknown 11000 1726867147.24802: calling self._execute() 11000 1726867147.24846: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867147.24858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867147.24872: variable 'omit' from source: magic vars 11000 1726867147.25628: variable 'ansible_distribution_major_version' from source: facts 11000 1726867147.25649: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867147.25663: variable 'omit' from source: magic vars 11000 1726867147.25721: variable 'omit' from source: magic vars 11000 1726867147.25976: variable 'interface' from source: task vars 11000 1726867147.25993: variable 'dhcp_interface1' from source: play vars 11000 1726867147.26058: variable 'dhcp_interface1' from source: play vars 11000 1726867147.26282: variable 'omit' from source: magic vars 11000 1726867147.26288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867147.26292: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867147.26401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867147.26425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867147.26444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867147.26480: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867147.26683: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867147.26689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867147.26794: Set connection var ansible_shell_type to sh 11000 1726867147.26809: Set connection var ansible_pipelining to False 11000 1726867147.26823: Set connection var ansible_shell_executable to /bin/sh 11000 1726867147.26830: Set connection var ansible_connection to ssh 11000 1726867147.26843: Set connection var ansible_timeout to 10 11000 1726867147.26854: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867147.26890: variable 'ansible_shell_executable' from source: unknown 11000 1726867147.26901: variable 'ansible_connection' from source: unknown 11000 1726867147.27084: variable 'ansible_module_compression' from source: unknown 11000 1726867147.27091: variable 'ansible_shell_type' from source: unknown 11000 1726867147.27094: variable 'ansible_shell_executable' from source: unknown 11000 1726867147.27096: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867147.27099: variable 'ansible_pipelining' from source: unknown 11000 1726867147.27102: variable 'ansible_timeout' from source: unknown 11000 1726867147.27104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867147.27394: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11000 1726867147.27409: variable 'omit' from source: magic vars 11000 1726867147.27417: starting attempt loop 11000 1726867147.27423: running the handler 11000 1726867147.27439: _low_level_execute_command(): starting 11000 1726867147.27450: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867147.28971: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867147.29222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867147.29267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867147.30926: stdout chunk (state=3): >>>/root <<< 11000 1726867147.31076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867147.31082: stdout chunk (state=3): >>><<< 11000 1726867147.31085: stderr chunk (state=3): >>><<< 11000 1726867147.31107: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867147.31127: _low_level_execute_command(): starting 11000 1726867147.31147: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867147.3111465-11399-254750875665607 `" && echo 
ansible-tmp-1726867147.3111465-11399-254750875665607="` echo /root/.ansible/tmp/ansible-tmp-1726867147.3111465-11399-254750875665607 `" ) && sleep 0' 11000 1726867147.32347: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867147.32350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867147.32354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867147.32357: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867147.32365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867147.32367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867147.32604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867147.32635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867147.34538: stdout chunk (state=3): >>>ansible-tmp-1726867147.3111465-11399-254750875665607=/root/.ansible/tmp/ansible-tmp-1726867147.3111465-11399-254750875665607 <<< 11000 1726867147.34998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867147.35025: stderr chunk (state=3): >>><<< 11000 1726867147.35074: stdout chunk (state=3): >>><<< 11000 1726867147.35104: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867147.3111465-11399-254750875665607=/root/.ansible/tmp/ansible-tmp-1726867147.3111465-11399-254750875665607 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867147.35284: variable 'ansible_module_compression' from source: 
unknown 11000 1726867147.35326: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11000 1726867147.35364: variable 'ansible_facts' from source: unknown 11000 1726867147.35603: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867147.3111465-11399-254750875665607/AnsiballZ_stat.py 11000 1726867147.35844: Sending initial data 11000 1726867147.35852: Sent initial data (153 bytes) 11000 1726867147.37027: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867147.37145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867147.37149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867147.37233: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867147.37406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867147.38973: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 11000 1726867147.39003: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867147.39037: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867147.39141: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpk94a90px /root/.ansible/tmp/ansible-tmp-1726867147.3111465-11399-254750875665607/AnsiballZ_stat.py <<< 11000 1726867147.39145: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867147.3111465-11399-254750875665607/AnsiballZ_stat.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpk94a90px" to remote "/root/.ansible/tmp/ansible-tmp-1726867147.3111465-11399-254750875665607/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867147.3111465-11399-254750875665607/AnsiballZ_stat.py" <<< 11000 1726867147.40389: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867147.40485: stderr chunk (state=3): >>><<< 11000 1726867147.40491: stdout chunk (state=3): >>><<< 11000 1726867147.40494: done transferring module to remote 11000 1726867147.40500: _low_level_execute_command(): starting 11000 1726867147.40510: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867147.3111465-11399-254750875665607/ /root/.ansible/tmp/ansible-tmp-1726867147.3111465-11399-254750875665607/AnsiballZ_stat.py && sleep 0' 11000 1726867147.42301: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867147.42329: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867147.42355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867147.42372: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867147.42456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867147.44264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867147.44280: stdout chunk (state=3): >>><<< 11000 1726867147.44297: stderr chunk (state=3): >>><<< 11000 1726867147.44318: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867147.44327: _low_level_execute_command(): starting 11000 1726867147.44337: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867147.3111465-11399-254750875665607/AnsiballZ_stat.py && sleep 0' 11000 1726867147.45587: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867147.45590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867147.45593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867147.45596: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867147.45599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867147.45761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867147.45772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867147.45802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867147.45876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867147.61037: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26353, "dev": 23, "nlink": 1, "atime": 1726867145.8993676, "mtime": 1726867145.8993676, "ctime": 1726867145.8993676, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11000 1726867147.62394: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867147.62398: stdout chunk (state=3): >>><<< 11000 1726867147.62400: stderr chunk (state=3): >>><<< 11000 1726867147.62484: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26353, "dev": 23, "nlink": 1, "atime": 1726867145.8993676, "mtime": 1726867145.8993676, "ctime": 1726867145.8993676, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
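The stat payload above reports /sys/class/net/test1 as an existing symlink pointing at ../../devices/virtual/net/test1, which is what a freshly created veth device looks like in sysfs. If one wanted to surface the fields the subsequent assertion relies on, a hypothetical debug helper (not part of the original test files) could look like this:

- name: Show the interface_stat fields used by the assertion (hypothetical helper)
  ansible.builtin.debug:
    msg: >-
      exists={{ interface_stat.stat.exists }},
      islnk={{ interface_stat.stat.islnk }},
      lnk_target={{ interface_stat.stat.lnk_target }}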
11000 1726867147.62674: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867147.3111465-11399-254750875665607/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867147.62693: _low_level_execute_command(): starting 11000 1726867147.62701: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867147.3111465-11399-254750875665607/ > /dev/null 2>&1 && sleep 0' 11000 1726867147.63933: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867147.63975: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867147.64095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867147.64200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867147.64322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867147.64361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867147.66260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867147.66270: stdout chunk (state=3): >>><<< 11000 1726867147.66284: stderr chunk (state=3): >>><<< 11000 1726867147.66309: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867147.66429: handler run complete 11000 1726867147.66489: attempt loop complete, returning result 11000 1726867147.66498: _execute() done 11000 1726867147.66544: dumping result to json 11000 1726867147.66593: done dumping result, returning 11000 1726867147.66597: done running TaskExecutor() for managed_node1/TASK: Get stat for interface test1 [0affcac9-a3a5-c734-026a-000000000153] 11000 1726867147.66599: sending task result for task 0affcac9-a3a5-c734-026a-000000000153 ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726867145.8993676, "block_size": 4096, "blocks": 0, "ctime": 1726867145.8993676, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26353, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726867145.8993676, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11000 1726867147.67192: no more pending results, returning what we have 11000 1726867147.67196: results queue empty 11000 1726867147.67197: checking for any_errors_fatal 11000 1726867147.67198: done checking for any_errors_fatal 11000 1726867147.67198: checking for max_fail_percentage 11000 1726867147.67200: done checking for max_fail_percentage 11000 1726867147.67201: checking to see if all hosts have failed and the running result is not ok 11000 1726867147.67202: done checking to see if all hosts have failed 11000 1726867147.67203: getting the remaining hosts for this loop 11000 1726867147.67204: done getting the remaining hosts for this loop 11000 1726867147.67208: getting the next task for host managed_node1 11000 1726867147.67216: done getting next task for host managed_node1 11000 1726867147.67219: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11000 1726867147.67222: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867147.67227: getting variables 11000 1726867147.67228: in VariableManager get_vars() 11000 1726867147.67289: Calling all_inventory to load vars for managed_node1 11000 1726867147.67293: Calling groups_inventory to load vars for managed_node1 11000 1726867147.67295: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867147.67308: Calling all_plugins_play to load vars for managed_node1 11000 1726867147.67311: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867147.67316: Calling groups_plugins_play to load vars for managed_node1 11000 1726867147.67774: done sending task result for task 0affcac9-a3a5-c734-026a-000000000153 11000 1726867147.67780: WORKER PROCESS EXITING 11000 1726867147.67808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867147.68013: done with get_vars() 11000 1726867147.68024: done getting variables 11000 1726867147.68124: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 11000 1726867147.68243: variable 'interface' from source: task vars 11000 1726867147.68248: variable 'dhcp_interface1' from source: play vars 11000 1726867147.68310: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 17:19:07 -0400 (0:00:00.447) 0:00:09.326 ****** 11000 1726867147.68346: entering _queue_task() for managed_node1/assert 11000 1726867147.68348: Creating lock for assert 11000 1726867147.68601: worker is 1 (out of 1 available) 11000 1726867147.68613: exiting _queue_task() for managed_node1/assert 11000 1726867147.68625: done queuing things up, now waiting for results queue to drain 11000 1726867147.68627: waiting for pending results... 
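The assertion queued next comes from assert_device_present.yml (lines 3 and 5 per the task paths printed in this log), which first includes get_interface_stat.yml and then asserts on the registered result. A sketch of what that file most likely contains, inferred from the task names and the conditional interface_stat.stat.exists evaluated just below; any fail_msg or additional checks in the real file are unknown:

- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists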
11000 1726867147.68899: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test1' 11000 1726867147.69013: in run() - task 0affcac9-a3a5-c734-026a-000000000017 11000 1726867147.69030: variable 'ansible_search_path' from source: unknown 11000 1726867147.69036: variable 'ansible_search_path' from source: unknown 11000 1726867147.69070: calling self._execute() 11000 1726867147.69154: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867147.69165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867147.69179: variable 'omit' from source: magic vars 11000 1726867147.69523: variable 'ansible_distribution_major_version' from source: facts 11000 1726867147.69547: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867147.69558: variable 'omit' from source: magic vars 11000 1726867147.69606: variable 'omit' from source: magic vars 11000 1726867147.69707: variable 'interface' from source: task vars 11000 1726867147.69718: variable 'dhcp_interface1' from source: play vars 11000 1726867147.69790: variable 'dhcp_interface1' from source: play vars 11000 1726867147.69813: variable 'omit' from source: magic vars 11000 1726867147.69855: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867147.69902: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867147.69925: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867147.69947: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867147.69964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867147.70009: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867147.70018: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867147.70026: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867147.70131: Set connection var ansible_shell_type to sh 11000 1726867147.70144: Set connection var ansible_pipelining to False 11000 1726867147.70156: Set connection var ansible_shell_executable to /bin/sh 11000 1726867147.70162: Set connection var ansible_connection to ssh 11000 1726867147.70171: Set connection var ansible_timeout to 10 11000 1726867147.70183: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867147.70226: variable 'ansible_shell_executable' from source: unknown 11000 1726867147.70237: variable 'ansible_connection' from source: unknown 11000 1726867147.70245: variable 'ansible_module_compression' from source: unknown 11000 1726867147.70282: variable 'ansible_shell_type' from source: unknown 11000 1726867147.70285: variable 'ansible_shell_executable' from source: unknown 11000 1726867147.70288: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867147.70290: variable 'ansible_pipelining' from source: unknown 11000 1726867147.70292: variable 'ansible_timeout' from source: unknown 11000 1726867147.70302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867147.70433: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867147.70448: variable 'omit' from source: magic vars 11000 1726867147.70482: starting attempt loop 11000 1726867147.70485: running the handler 11000 1726867147.70601: variable 'interface_stat' from source: set_fact 11000 1726867147.70631: Evaluated conditional (interface_stat.stat.exists): True 11000 1726867147.70641: handler run complete 11000 1726867147.70681: attempt loop complete, returning result 11000 1726867147.70685: _execute() done 11000 1726867147.70687: dumping result to json 11000 1726867147.70689: done dumping result, returning 11000 1726867147.70692: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test1' [0affcac9-a3a5-c734-026a-000000000017] 11000 1726867147.70694: sending task result for task 0affcac9-a3a5-c734-026a-000000000017 11000 1726867147.70896: done sending task result for task 0affcac9-a3a5-c734-026a-000000000017 11000 1726867147.70900: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11000 1726867147.70946: no more pending results, returning what we have 11000 1726867147.70954: results queue empty 11000 1726867147.70955: checking for any_errors_fatal 11000 1726867147.70965: done checking for any_errors_fatal 11000 1726867147.70965: checking for max_fail_percentage 11000 1726867147.70967: done checking for max_fail_percentage 11000 1726867147.70968: checking to see if all hosts have failed and the running result is not ok 11000 1726867147.70969: done checking to see if all hosts have failed 11000 1726867147.70970: getting the remaining hosts for this loop 11000 1726867147.70971: done getting the remaining hosts for this loop 11000 1726867147.70974: getting the next task for host managed_node1 11000 1726867147.70985: done getting next task for host managed_node1 11000 1726867147.70988: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11000 1726867147.70991: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867147.70995: getting variables 11000 1726867147.70996: in VariableManager get_vars() 11000 1726867147.71034: Calling all_inventory to load vars for managed_node1 11000 1726867147.71037: Calling groups_inventory to load vars for managed_node1 11000 1726867147.71039: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867147.71050: Calling all_plugins_play to load vars for managed_node1 11000 1726867147.71053: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867147.71056: Calling groups_plugins_play to load vars for managed_node1 11000 1726867147.71350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867147.71555: done with get_vars() 11000 1726867147.71564: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 17:19:07 -0400 (0:00:00.033) 0:00:09.359 ****** 11000 1726867147.71659: entering _queue_task() for managed_node1/include_tasks 11000 1726867147.71947: worker is 1 (out of 1 available) 11000 1726867147.71956: exiting _queue_task() for managed_node1/include_tasks 11000 1726867147.71967: done queuing things up, now waiting for results queue to drain 11000 1726867147.71968: waiting for pending results... 11000 1726867147.72175: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 11000 1726867147.72240: in run() - task 0affcac9-a3a5-c734-026a-00000000001b 11000 1726867147.72258: variable 'ansible_search_path' from source: unknown 11000 1726867147.72270: variable 'ansible_search_path' from source: unknown 11000 1726867147.72310: calling self._execute() 11000 1726867147.72387: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867147.72398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867147.72409: variable 'omit' from source: magic vars 11000 1726867147.72818: variable 'ansible_distribution_major_version' from source: facts 11000 1726867147.72821: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867147.72823: _execute() done 11000 1726867147.72827: dumping result to json 11000 1726867147.72835: done dumping result, returning 11000 1726867147.72843: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-c734-026a-00000000001b] 11000 1726867147.72882: sending task result for task 0affcac9-a3a5-c734-026a-00000000001b 11000 1726867147.73105: done sending task result for task 0affcac9-a3a5-c734-026a-00000000001b 11000 1726867147.73107: WORKER PROCESS EXITING 11000 1726867147.73127: no more pending results, returning what we have 11000 1726867147.73130: in VariableManager get_vars() 11000 1726867147.73162: Calling all_inventory to load vars for managed_node1 11000 1726867147.73164: Calling groups_inventory to load vars for managed_node1 11000 1726867147.73166: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867147.73173: Calling all_plugins_play to load vars for managed_node1 11000 1726867147.73175: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867147.73180: Calling groups_plugins_play to load vars for managed_node1 11000 1726867147.73405: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 11000 1726867147.73600: done with get_vars() 11000 1726867147.73607: variable 'ansible_search_path' from source: unknown 11000 1726867147.73608: variable 'ansible_search_path' from source: unknown 11000 1726867147.73642: we have included files to process 11000 1726867147.73643: generating all_blocks data 11000 1726867147.73645: done generating all_blocks data 11000 1726867147.73648: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11000 1726867147.73649: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11000 1726867147.73656: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11000 1726867147.73820: done processing included file 11000 1726867147.73822: iterating over new_blocks loaded from include file 11000 1726867147.73824: in VariableManager get_vars() 11000 1726867147.73841: done with get_vars() 11000 1726867147.73842: filtering new block on tags 11000 1726867147.73857: done filtering new block on tags 11000 1726867147.73859: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 11000 1726867147.73863: extending task lists for all hosts with included blocks 11000 1726867147.73965: done extending task lists 11000 1726867147.73966: done processing included files 11000 1726867147.73967: results queue empty 11000 1726867147.73968: checking for any_errors_fatal 11000 1726867147.73970: done checking for any_errors_fatal 11000 1726867147.73970: checking for max_fail_percentage 11000 1726867147.73971: done checking for max_fail_percentage 11000 1726867147.73972: checking to see if all hosts have failed and the running result is not ok 11000 1726867147.73973: done checking to see if all hosts have failed 11000 1726867147.73973: getting the remaining hosts for this loop 11000 1726867147.73974: done getting the remaining hosts for this loop 11000 1726867147.73981: getting the next task for host managed_node1 11000 1726867147.73985: done getting next task for host managed_node1 11000 1726867147.73987: ^ task is: TASK: Get stat for interface {{ interface }} 11000 1726867147.73990: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867147.73992: getting variables 11000 1726867147.73993: in VariableManager get_vars() 11000 1726867147.74005: Calling all_inventory to load vars for managed_node1 11000 1726867147.74007: Calling groups_inventory to load vars for managed_node1 11000 1726867147.74009: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867147.74014: Calling all_plugins_play to load vars for managed_node1 11000 1726867147.74016: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867147.74019: Calling groups_plugins_play to load vars for managed_node1 11000 1726867147.74155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867147.74555: done with get_vars() 11000 1726867147.74563: done getting variables 11000 1726867147.74701: variable 'interface' from source: task vars 11000 1726867147.74705: variable 'dhcp_interface2' from source: play vars 11000 1726867147.74764: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:19:07 -0400 (0:00:00.031) 0:00:09.391 ****** 11000 1726867147.74794: entering _queue_task() for managed_node1/stat 11000 1726867147.75012: worker is 1 (out of 1 available) 11000 1726867147.75022: exiting _queue_task() for managed_node1/stat 11000 1726867147.75033: done queuing things up, now waiting for results queue to drain 11000 1726867147.75034: waiting for pending results... 11000 1726867147.75266: running TaskExecutor() for managed_node1/TASK: Get stat for interface test2 11000 1726867147.75396: in run() - task 0affcac9-a3a5-c734-026a-00000000016b 11000 1726867147.75415: variable 'ansible_search_path' from source: unknown 11000 1726867147.75423: variable 'ansible_search_path' from source: unknown 11000 1726867147.75459: calling self._execute() 11000 1726867147.75538: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867147.75548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867147.75558: variable 'omit' from source: magic vars 11000 1726867147.75874: variable 'ansible_distribution_major_version' from source: facts 11000 1726867147.75893: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867147.75925: variable 'omit' from source: magic vars 11000 1726867147.75959: variable 'omit' from source: magic vars 11000 1726867147.76048: variable 'interface' from source: task vars 11000 1726867147.76056: variable 'dhcp_interface2' from source: play vars 11000 1726867147.76142: variable 'dhcp_interface2' from source: play vars 11000 1726867147.76146: variable 'omit' from source: magic vars 11000 1726867147.76176: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867147.76213: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867147.76252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867147.76685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867147.76688: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 11000 1726867147.76691: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867147.76694: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867147.76696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867147.76698: Set connection var ansible_shell_type to sh 11000 1726867147.76700: Set connection var ansible_pipelining to False 11000 1726867147.76702: Set connection var ansible_shell_executable to /bin/sh 11000 1726867147.76704: Set connection var ansible_connection to ssh 11000 1726867147.76706: Set connection var ansible_timeout to 10 11000 1726867147.76708: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867147.76713: variable 'ansible_shell_executable' from source: unknown 11000 1726867147.76722: variable 'ansible_connection' from source: unknown 11000 1726867147.76729: variable 'ansible_module_compression' from source: unknown 11000 1726867147.76736: variable 'ansible_shell_type' from source: unknown 11000 1726867147.76743: variable 'ansible_shell_executable' from source: unknown 11000 1726867147.76750: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867147.76798: variable 'ansible_pipelining' from source: unknown 11000 1726867147.76806: variable 'ansible_timeout' from source: unknown 11000 1726867147.76813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867147.77163: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11000 1726867147.77259: variable 'omit' from source: magic vars 11000 1726867147.77269: starting attempt loop 11000 1726867147.77296: running the handler 11000 1726867147.77314: _low_level_execute_command(): starting 11000 1726867147.77347: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867147.78799: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867147.78874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867147.78934: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867147.78945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867147.79005: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867147.79098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867147.79152: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867147.79225: 
stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867147.80878: stdout chunk (state=3): >>>/root <<< 11000 1726867147.81019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867147.81046: stderr chunk (state=3): >>><<< 11000 1726867147.81240: stdout chunk (state=3): >>><<< 11000 1726867147.81246: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867147.81248: _low_level_execute_command(): starting 11000 1726867147.81252: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867147.8115175-11427-219078652426824 `" && echo ansible-tmp-1726867147.8115175-11427-219078652426824="` echo /root/.ansible/tmp/ansible-tmp-1726867147.8115175-11427-219078652426824 `" ) && sleep 0' 11000 1726867147.82373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867147.82376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867147.82398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867147.82416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867147.82447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867147.82694: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867147.82725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867147.82728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 11000 1726867147.82857: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867147.84719: stdout chunk (state=3): >>>ansible-tmp-1726867147.8115175-11427-219078652426824=/root/.ansible/tmp/ansible-tmp-1726867147.8115175-11427-219078652426824 <<< 11000 1726867147.84833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867147.84862: stderr chunk (state=3): >>><<< 11000 1726867147.85066: stdout chunk (state=3): >>><<< 11000 1726867147.85070: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867147.8115175-11427-219078652426824=/root/.ansible/tmp/ansible-tmp-1726867147.8115175-11427-219078652426824 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867147.85072: variable 'ansible_module_compression' from source: unknown 11000 1726867147.85159: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11000 1726867147.85252: variable 'ansible_facts' from source: unknown 11000 1726867147.85583: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867147.8115175-11427-219078652426824/AnsiballZ_stat.py 11000 1726867147.85709: Sending initial data 11000 1726867147.85719: Sent initial data (153 bytes) 11000 1726867147.86804: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867147.86873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867147.86896: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867147.86997: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867147.87058: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867147.88705: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867147.88761: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11000 1726867147.88885: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpf2zsekn8 /root/.ansible/tmp/ansible-tmp-1726867147.8115175-11427-219078652426824/AnsiballZ_stat.py <<< 11000 1726867147.88891: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867147.8115175-11427-219078652426824/AnsiballZ_stat.py" <<< 11000 1726867147.89001: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpf2zsekn8" to remote "/root/.ansible/tmp/ansible-tmp-1726867147.8115175-11427-219078652426824/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867147.8115175-11427-219078652426824/AnsiballZ_stat.py" <<< 11000 1726867147.90318: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867147.90366: stderr chunk (state=3): >>><<< 11000 1726867147.90394: stdout chunk (state=3): >>><<< 11000 1726867147.90461: done transferring module to remote 11000 1726867147.90568: _low_level_execute_command(): starting 11000 1726867147.90572: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867147.8115175-11427-219078652426824/ /root/.ansible/tmp/ansible-tmp-1726867147.8115175-11427-219078652426824/AnsiballZ_stat.py && sleep 0' 11000 1726867147.91600: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867147.91604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867147.91732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867147.91735: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867147.91738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867147.91743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867147.91789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867147.91896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867147.91956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867147.93783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867147.93835: stderr chunk (state=3): >>><<< 11000 1726867147.93845: stdout chunk (state=3): >>><<< 11000 1726867147.94023: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867147.94027: _low_level_execute_command(): starting 11000 1726867147.94030: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867147.8115175-11427-219078652426824/AnsiballZ_stat.py && sleep 0' 11000 1726867147.95107: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867147.95122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867147.95135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867147.95157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867147.95480: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867147.95591: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867147.95874: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867148.10914: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26759, "dev": 23, "nlink": 1, "atime": 1726867145.9040318, "mtime": 1726867145.9040318, "ctime": 1726867145.9040318, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11000 1726867148.12231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867148.12243: stdout chunk (state=3): >>><<< 11000 1726867148.12255: stderr chunk (state=3): >>><<< 11000 1726867148.12303: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26759, "dev": 23, "nlink": 1, "atime": 1726867145.9040318, "mtime": 1726867145.9040318, "ctime": 1726867145.9040318, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 11000 1726867148.12417: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867147.8115175-11427-219078652426824/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867148.12665: _low_level_execute_command(): starting 11000 1726867148.12673: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867147.8115175-11427-219078652426824/ > /dev/null 2>&1 && sleep 0' 11000 1726867148.13753: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867148.13764: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867148.13923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867148.14083: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867148.14129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867148.15960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867148.15969: stdout chunk (state=3): >>><<< 11000 1726867148.15983: stderr chunk (state=3): >>><<< 11000 1726867148.16007: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867148.16019: handler run complete 11000 1726867148.16183: attempt loop complete, returning result 11000 1726867148.16186: _execute() done 11000 1726867148.16189: dumping result to json 11000 1726867148.16191: done dumping result, returning 11000 1726867148.16192: done running TaskExecutor() for managed_node1/TASK: Get stat for interface test2 [0affcac9-a3a5-c734-026a-00000000016b] 11000 1726867148.16194: sending task result for task 0affcac9-a3a5-c734-026a-00000000016b ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726867145.9040318, "block_size": 4096, "blocks": 0, "ctime": 1726867145.9040318, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26759, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726867145.9040318, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11000 1726867148.16525: no more pending results, returning what we have 11000 1726867148.16529: results queue empty 11000 1726867148.16530: checking for any_errors_fatal 11000 1726867148.16531: done checking for any_errors_fatal 11000 1726867148.16532: checking for max_fail_percentage 11000 1726867148.16533: done checking for max_fail_percentage 11000 1726867148.16534: checking to see if all hosts have failed and the running result is not ok 11000 1726867148.16535: done checking to see if all hosts have failed 11000 1726867148.16536: getting the remaining hosts for this loop 11000 1726867148.16537: done getting the remaining hosts for this loop 11000 1726867148.16541: getting the next task for host managed_node1 11000 1726867148.16549: done getting next task for host managed_node1 11000 1726867148.16552: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11000 1726867148.16555: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867148.16560: getting variables 11000 1726867148.16561: in VariableManager get_vars() 11000 1726867148.17209: Calling all_inventory to load vars for managed_node1 11000 1726867148.17212: Calling groups_inventory to load vars for managed_node1 11000 1726867148.17215: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867148.17227: Calling all_plugins_play to load vars for managed_node1 11000 1726867148.17230: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867148.17234: Calling groups_plugins_play to load vars for managed_node1 11000 1726867148.17711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867148.18120: done with get_vars() 11000 1726867148.18131: done getting variables 11000 1726867148.18413: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867148.18530: variable 'interface' from source: task vars 11000 1726867148.18534: variable 'dhcp_interface2' from source: play vars 11000 1726867148.18774: variable 'dhcp_interface2' from source: play vars 11000 1726867148.18803: done sending task result for task 0affcac9-a3a5-c734-026a-00000000016b 11000 1726867148.18806: WORKER PROCESS EXITING TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 17:19:08 -0400 (0:00:00.440) 0:00:09.831 ****** 11000 1726867148.18818: entering _queue_task() for managed_node1/assert 11000 1726867148.19075: worker is 1 (out of 1 available) 11000 1726867148.19094: exiting _queue_task() for managed_node1/assert 11000 1726867148.19106: done queuing things up, now waiting for results queue to drain 11000 1726867148.19107: waiting for pending results... 
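
The trace above covers the two tasks pulled in from assert_device_present.yml: a stat call against /sys/class/net/test2 (its module arguments are visible in the result payload) followed by the assert that is queued next. Below is a minimal sketch of what those tasks plausibly look like, reconstructed only from the module arguments, task names, and the conditional recorded in this log; the register name and the exact contents of get_interface_stat.yml are assumptions, not verbatim copies of the collection files.

# get_interface_stat.yml (sketch)
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false   # matches the module_args shown in the stat result above
    get_checksum: false
    get_mime: false
  register: interface_stat  # assumed name; the log later evaluates interface_stat.stat.exists

# assert_device_present.yml, step after the include (sketch)
- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists   # the conditional this log reports as True
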
11000 1726867148.19530: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test2' 11000 1726867148.19803: in run() - task 0affcac9-a3a5-c734-026a-00000000001c 11000 1726867148.19824: variable 'ansible_search_path' from source: unknown 11000 1726867148.19832: variable 'ansible_search_path' from source: unknown 11000 1726867148.19919: calling self._execute() 11000 1726867148.20006: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867148.20193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867148.20209: variable 'omit' from source: magic vars 11000 1726867148.21035: variable 'ansible_distribution_major_version' from source: facts 11000 1726867148.21054: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867148.21064: variable 'omit' from source: magic vars 11000 1726867148.21111: variable 'omit' from source: magic vars 11000 1726867148.21366: variable 'interface' from source: task vars 11000 1726867148.21375: variable 'dhcp_interface2' from source: play vars 11000 1726867148.21441: variable 'dhcp_interface2' from source: play vars 11000 1726867148.21462: variable 'omit' from source: magic vars 11000 1726867148.21782: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867148.21789: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867148.21791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867148.21793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867148.21982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867148.21985: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867148.21991: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867148.21993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867148.21995: Set connection var ansible_shell_type to sh 11000 1726867148.21997: Set connection var ansible_pipelining to False 11000 1726867148.21999: Set connection var ansible_shell_executable to /bin/sh 11000 1726867148.22001: Set connection var ansible_connection to ssh 11000 1726867148.22003: Set connection var ansible_timeout to 10 11000 1726867148.22005: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867148.22193: variable 'ansible_shell_executable' from source: unknown 11000 1726867148.22202: variable 'ansible_connection' from source: unknown 11000 1726867148.22211: variable 'ansible_module_compression' from source: unknown 11000 1726867148.22218: variable 'ansible_shell_type' from source: unknown 11000 1726867148.22225: variable 'ansible_shell_executable' from source: unknown 11000 1726867148.22233: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867148.22240: variable 'ansible_pipelining' from source: unknown 11000 1726867148.22250: variable 'ansible_timeout' from source: unknown 11000 1726867148.22258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867148.22605: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867148.22621: variable 'omit' from source: magic vars 11000 1726867148.22631: starting attempt loop 11000 1726867148.22638: running the handler 11000 1726867148.22770: variable 'interface_stat' from source: set_fact 11000 1726867148.23182: Evaluated conditional (interface_stat.stat.exists): True 11000 1726867148.23188: handler run complete 11000 1726867148.23191: attempt loop complete, returning result 11000 1726867148.23193: _execute() done 11000 1726867148.23196: dumping result to json 11000 1726867148.23198: done dumping result, returning 11000 1726867148.23200: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test2' [0affcac9-a3a5-c734-026a-00000000001c] 11000 1726867148.23203: sending task result for task 0affcac9-a3a5-c734-026a-00000000001c 11000 1726867148.23272: done sending task result for task 0affcac9-a3a5-c734-026a-00000000001c 11000 1726867148.23275: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11000 1726867148.23421: no more pending results, returning what we have 11000 1726867148.23425: results queue empty 11000 1726867148.23426: checking for any_errors_fatal 11000 1726867148.23433: done checking for any_errors_fatal 11000 1726867148.23433: checking for max_fail_percentage 11000 1726867148.23435: done checking for max_fail_percentage 11000 1726867148.23436: checking to see if all hosts have failed and the running result is not ok 11000 1726867148.23437: done checking to see if all hosts have failed 11000 1726867148.23437: getting the remaining hosts for this loop 11000 1726867148.23438: done getting the remaining hosts for this loop 11000 1726867148.23441: getting the next task for host managed_node1 11000 1726867148.23448: done getting next task for host managed_node1 11000 1726867148.23450: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 11000 1726867148.23452: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867148.23454: getting variables 11000 1726867148.23456: in VariableManager get_vars() 11000 1726867148.23497: Calling all_inventory to load vars for managed_node1 11000 1726867148.23499: Calling groups_inventory to load vars for managed_node1 11000 1726867148.23502: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867148.23512: Calling all_plugins_play to load vars for managed_node1 11000 1726867148.23514: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867148.23516: Calling groups_plugins_play to load vars for managed_node1 11000 1726867148.23945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867148.24411: done with get_vars() 11000 1726867148.24422: done getting variables 11000 1726867148.24783: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:28 Friday 20 September 2024 17:19:08 -0400 (0:00:00.059) 0:00:09.891 ****** 11000 1726867148.24810: entering _queue_task() for managed_node1/command 11000 1726867148.25253: worker is 1 (out of 1 available) 11000 1726867148.25266: exiting _queue_task() for managed_node1/command 11000 1726867148.25282: done queuing things up, now waiting for results queue to drain 11000 1726867148.25284: waiting for pending results... 11000 1726867148.25740: running TaskExecutor() for managed_node1/TASK: Backup the /etc/resolv.conf for initscript 11000 1726867148.25954: in run() - task 0affcac9-a3a5-c734-026a-00000000001d 11000 1726867148.25976: variable 'ansible_search_path' from source: unknown 11000 1726867148.26091: calling self._execute() 11000 1726867148.26297: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867148.26309: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867148.26323: variable 'omit' from source: magic vars 11000 1726867148.27116: variable 'ansible_distribution_major_version' from source: facts 11000 1726867148.27134: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867148.27383: variable 'network_provider' from source: set_fact 11000 1726867148.27390: Evaluated conditional (network_provider == "initscripts"): False 11000 1726867148.27393: when evaluation is False, skipping this task 11000 1726867148.27396: _execute() done 11000 1726867148.27398: dumping result to json 11000 1726867148.27400: done dumping result, returning 11000 1726867148.27403: done running TaskExecutor() for managed_node1/TASK: Backup the /etc/resolv.conf for initscript [0affcac9-a3a5-c734-026a-00000000001d] 11000 1726867148.27405: sending task result for task 0affcac9-a3a5-c734-026a-00000000001d 11000 1726867148.27723: done sending task result for task 0affcac9-a3a5-c734-026a-00000000001d 11000 1726867148.27726: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11000 1726867148.27807: no more pending results, returning what we have 11000 1726867148.27810: results 
queue empty 11000 1726867148.27811: checking for any_errors_fatal 11000 1726867148.27817: done checking for any_errors_fatal 11000 1726867148.27818: checking for max_fail_percentage 11000 1726867148.27819: done checking for max_fail_percentage 11000 1726867148.27820: checking to see if all hosts have failed and the running result is not ok 11000 1726867148.27821: done checking to see if all hosts have failed 11000 1726867148.27822: getting the remaining hosts for this loop 11000 1726867148.27823: done getting the remaining hosts for this loop 11000 1726867148.27826: getting the next task for host managed_node1 11000 1726867148.27831: done getting next task for host managed_node1 11000 1726867148.27833: ^ task is: TASK: TEST Add Bond with 2 ports using deprecated 'master' argument 11000 1726867148.27835: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867148.27837: getting variables 11000 1726867148.27839: in VariableManager get_vars() 11000 1726867148.27875: Calling all_inventory to load vars for managed_node1 11000 1726867148.27879: Calling groups_inventory to load vars for managed_node1 11000 1726867148.27882: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867148.27892: Calling all_plugins_play to load vars for managed_node1 11000 1726867148.27894: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867148.27898: Calling groups_plugins_play to load vars for managed_node1 11000 1726867148.28283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867148.28741: done with get_vars() 11000 1726867148.28751: done getting variables 11000 1726867148.28924: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST Add Bond with 2 ports using deprecated 'master' argument] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:33 Friday 20 September 2024 17:19:08 -0400 (0:00:00.041) 0:00:09.932 ****** 11000 1726867148.28949: entering _queue_task() for managed_node1/debug 11000 1726867148.29488: worker is 1 (out of 1 available) 11000 1726867148.29500: exiting _queue_task() for managed_node1/debug 11000 1726867148.29512: done queuing things up, now waiting for results queue to drain 11000 1726867148.29513: waiting for pending results... 
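
Two things happen in the stretch above: the 'Backup the /etc/resolv.conf for initscript' task is skipped because network_provider is not "initscripts", and the debug banner task for the bond test is queued. A hedged sketch of how such tasks are commonly written follows; only the task names, the when: condition, and the '#' banner message are taken from this log, and the cp command is an illustrative assumption.

- name: Backup the /etc/resolv.conf for initscript
  command: cp /etc/resolv.conf /etc/resolv.conf.bak  # assumed command; the log only records that the task was skipped
  when: network_provider == "initscripts"            # the false_condition reported in the skip result above

- name: TEST Add Bond with 2 ports using deprecated 'master' argument
  debug:
    msg: "##################################################"
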
11000 1726867148.29820: running TaskExecutor() for managed_node1/TASK: TEST Add Bond with 2 ports using deprecated 'master' argument 11000 1726867148.30050: in run() - task 0affcac9-a3a5-c734-026a-00000000001e 11000 1726867148.30102: variable 'ansible_search_path' from source: unknown 11000 1726867148.30312: calling self._execute() 11000 1726867148.30388: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867148.30462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867148.30479: variable 'omit' from source: magic vars 11000 1726867148.31574: variable 'ansible_distribution_major_version' from source: facts 11000 1726867148.31579: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867148.31581: variable 'omit' from source: magic vars 11000 1726867148.31583: variable 'omit' from source: magic vars 11000 1726867148.31585: variable 'omit' from source: magic vars 11000 1726867148.31692: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867148.31736: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867148.31765: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867148.31852: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867148.31871: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867148.31935: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867148.32000: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867148.32015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867148.32194: Set connection var ansible_shell_type to sh 11000 1726867148.32585: Set connection var ansible_pipelining to False 11000 1726867148.32591: Set connection var ansible_shell_executable to /bin/sh 11000 1726867148.32594: Set connection var ansible_connection to ssh 11000 1726867148.32596: Set connection var ansible_timeout to 10 11000 1726867148.32598: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867148.32600: variable 'ansible_shell_executable' from source: unknown 11000 1726867148.32602: variable 'ansible_connection' from source: unknown 11000 1726867148.32604: variable 'ansible_module_compression' from source: unknown 11000 1726867148.32606: variable 'ansible_shell_type' from source: unknown 11000 1726867148.32608: variable 'ansible_shell_executable' from source: unknown 11000 1726867148.32611: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867148.32613: variable 'ansible_pipelining' from source: unknown 11000 1726867148.32615: variable 'ansible_timeout' from source: unknown 11000 1726867148.32617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867148.32716: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867148.32791: variable 'omit' from source: magic vars 11000 1726867148.32807: starting attempt loop 11000 
1726867148.32814: running the handler 11000 1726867148.32982: handler run complete 11000 1726867148.32989: attempt loop complete, returning result 11000 1726867148.32992: _execute() done 11000 1726867148.32994: dumping result to json 11000 1726867148.32996: done dumping result, returning 11000 1726867148.32998: done running TaskExecutor() for managed_node1/TASK: TEST Add Bond with 2 ports using deprecated 'master' argument [0affcac9-a3a5-c734-026a-00000000001e] 11000 1726867148.33000: sending task result for task 0affcac9-a3a5-c734-026a-00000000001e ok: [managed_node1] => {} MSG: ################################################## 11000 1726867148.33206: no more pending results, returning what we have 11000 1726867148.33209: results queue empty 11000 1726867148.33210: checking for any_errors_fatal 11000 1726867148.33215: done checking for any_errors_fatal 11000 1726867148.33215: checking for max_fail_percentage 11000 1726867148.33217: done checking for max_fail_percentage 11000 1726867148.33218: checking to see if all hosts have failed and the running result is not ok 11000 1726867148.33218: done checking to see if all hosts have failed 11000 1726867148.33219: getting the remaining hosts for this loop 11000 1726867148.33220: done getting the remaining hosts for this loop 11000 1726867148.33223: getting the next task for host managed_node1 11000 1726867148.33230: done getting next task for host managed_node1 11000 1726867148.33236: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11000 1726867148.33238: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867148.33252: getting variables 11000 1726867148.33253: in VariableManager get_vars() 11000 1726867148.33293: Calling all_inventory to load vars for managed_node1 11000 1726867148.33296: Calling groups_inventory to load vars for managed_node1 11000 1726867148.33299: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867148.33309: Calling all_plugins_play to load vars for managed_node1 11000 1726867148.33312: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867148.33315: Calling groups_plugins_play to load vars for managed_node1 11000 1726867148.33840: done sending task result for task 0affcac9-a3a5-c734-026a-00000000001e 11000 1726867148.33844: WORKER PROCESS EXITING 11000 1726867148.33857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867148.34224: done with get_vars() 11000 1726867148.34233: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:19:08 -0400 (0:00:00.053) 0:00:09.986 ****** 11000 1726867148.34315: entering _queue_task() for managed_node1/include_tasks 11000 1726867148.34809: worker is 1 (out of 1 available) 11000 1726867148.34824: exiting _queue_task() for managed_node1/include_tasks 11000 1726867148.34836: done queuing things up, now waiting for results queue to drain 11000 1726867148.34837: waiting for pending results... 11000 1726867148.35335: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11000 1726867148.35571: in run() - task 0affcac9-a3a5-c734-026a-000000000026 11000 1726867148.35594: variable 'ansible_search_path' from source: unknown 11000 1726867148.35783: variable 'ansible_search_path' from source: unknown 11000 1726867148.35786: calling self._execute() 11000 1726867148.35869: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867148.35884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867148.35898: variable 'omit' from source: magic vars 11000 1726867148.36687: variable 'ansible_distribution_major_version' from source: facts 11000 1726867148.36690: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867148.36693: _execute() done 11000 1726867148.36696: dumping result to json 11000 1726867148.36698: done dumping result, returning 11000 1726867148.36701: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-c734-026a-000000000026] 11000 1726867148.36703: sending task result for task 0affcac9-a3a5-c734-026a-000000000026 11000 1726867148.36991: no more pending results, returning what we have 11000 1726867148.36995: in VariableManager get_vars() 11000 1726867148.37036: Calling all_inventory to load vars for managed_node1 11000 1726867148.37039: Calling groups_inventory to load vars for managed_node1 11000 1726867148.37041: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867148.37052: Calling all_plugins_play to load vars for managed_node1 11000 1726867148.37055: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867148.37059: Calling groups_plugins_play to load vars for managed_node1 11000 1726867148.37435: done sending task result for task 
0affcac9-a3a5-c734-026a-000000000026 11000 1726867148.37441: WORKER PROCESS EXITING 11000 1726867148.37463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867148.37858: done with get_vars() 11000 1726867148.37866: variable 'ansible_search_path' from source: unknown 11000 1726867148.37867: variable 'ansible_search_path' from source: unknown 11000 1726867148.37905: we have included files to process 11000 1726867148.37906: generating all_blocks data 11000 1726867148.37908: done generating all_blocks data 11000 1726867148.37912: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11000 1726867148.37913: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11000 1726867148.37914: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11000 1726867148.39271: done processing included file 11000 1726867148.39273: iterating over new_blocks loaded from include file 11000 1726867148.39275: in VariableManager get_vars() 11000 1726867148.39510: done with get_vars() 11000 1726867148.39512: filtering new block on tags 11000 1726867148.39530: done filtering new block on tags 11000 1726867148.39533: in VariableManager get_vars() 11000 1726867148.39556: done with get_vars() 11000 1726867148.39558: filtering new block on tags 11000 1726867148.39581: done filtering new block on tags 11000 1726867148.39584: in VariableManager get_vars() 11000 1726867148.39610: done with get_vars() 11000 1726867148.39611: filtering new block on tags 11000 1726867148.39629: done filtering new block on tags 11000 1726867148.39631: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 11000 1726867148.39636: extending task lists for all hosts with included blocks 11000 1726867148.41448: done extending task lists 11000 1726867148.41449: done processing included files 11000 1726867148.41450: results queue empty 11000 1726867148.41451: checking for any_errors_fatal 11000 1726867148.41453: done checking for any_errors_fatal 11000 1726867148.41454: checking for max_fail_percentage 11000 1726867148.41455: done checking for max_fail_percentage 11000 1726867148.41456: checking to see if all hosts have failed and the running result is not ok 11000 1726867148.41456: done checking to see if all hosts have failed 11000 1726867148.41457: getting the remaining hosts for this loop 11000 1726867148.41458: done getting the remaining hosts for this loop 11000 1726867148.41460: getting the next task for host managed_node1 11000 1726867148.41464: done getting next task for host managed_node1 11000 1726867148.41466: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11000 1726867148.41469: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867148.41680: getting variables 11000 1726867148.41681: in VariableManager get_vars() 11000 1726867148.41699: Calling all_inventory to load vars for managed_node1 11000 1726867148.41701: Calling groups_inventory to load vars for managed_node1 11000 1726867148.41702: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867148.41707: Calling all_plugins_play to load vars for managed_node1 11000 1726867148.41709: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867148.41712: Calling groups_plugins_play to load vars for managed_node1 11000 1726867148.41871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867148.42276: done with get_vars() 11000 1726867148.42491: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:19:08 -0400 (0:00:00.082) 0:00:10.068 ****** 11000 1726867148.42564: entering _queue_task() for managed_node1/setup 11000 1726867148.43068: worker is 1 (out of 1 available) 11000 1726867148.43283: exiting _queue_task() for managed_node1/setup 11000 1726867148.43299: done queuing things up, now waiting for results queue to drain 11000 1726867148.43300: waiting for pending results... 
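For orientation: the preceding entries show main.yml:4 resolving into an include of set_facts.yml once the distribution check passes, after which the first included task (set_facts.yml:3) is queued as a setup action. A minimal sketch of that include pattern, reconstructed only from the task name and the conditional printed in this trace (the actual role file may differ):

    # roles/network/tasks/main.yml (sketch, not the verbatim role source)
    - name: Ensure ansible_facts used by role
      include_tasks: set_facts.yml
      when: ansible_distribution_major_version != '6'
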
11000 1726867148.43893: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11000 1726867148.44224: in run() - task 0affcac9-a3a5-c734-026a-000000000189 11000 1726867148.44281: variable 'ansible_search_path' from source: unknown 11000 1726867148.44293: variable 'ansible_search_path' from source: unknown 11000 1726867148.44395: calling self._execute() 11000 1726867148.44445: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867148.44456: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867148.44469: variable 'omit' from source: magic vars 11000 1726867148.44865: variable 'ansible_distribution_major_version' from source: facts 11000 1726867148.44886: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867148.45122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867148.48125: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867148.48222: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867148.48234: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867148.48266: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867148.48306: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867148.48394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867148.48440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867148.48549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867148.48553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867148.48556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867148.48596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867148.48625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867148.48662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867148.48709: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867148.48728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867148.48897: variable '__network_required_facts' from source: role '' defaults 11000 1726867148.48914: variable 'ansible_facts' from source: unknown 11000 1726867148.49009: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11000 1726867148.49018: when evaluation is False, skipping this task 11000 1726867148.49026: _execute() done 11000 1726867148.49033: dumping result to json 11000 1726867148.49040: done dumping result, returning 11000 1726867148.49052: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-c734-026a-000000000189] 11000 1726867148.49061: sending task result for task 0affcac9-a3a5-c734-026a-000000000189 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11000 1726867148.49297: no more pending results, returning what we have 11000 1726867148.49301: results queue empty 11000 1726867148.49304: checking for any_errors_fatal 11000 1726867148.49306: done checking for any_errors_fatal 11000 1726867148.49307: checking for max_fail_percentage 11000 1726867148.49309: done checking for max_fail_percentage 11000 1726867148.49309: checking to see if all hosts have failed and the running result is not ok 11000 1726867148.49310: done checking to see if all hosts have failed 11000 1726867148.49311: getting the remaining hosts for this loop 11000 1726867148.49312: done getting the remaining hosts for this loop 11000 1726867148.49316: getting the next task for host managed_node1 11000 1726867148.49326: done getting next task for host managed_node1 11000 1726867148.49330: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11000 1726867148.49334: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867148.49347: getting variables 11000 1726867148.49349: in VariableManager get_vars() 11000 1726867148.49394: Calling all_inventory to load vars for managed_node1 11000 1726867148.49397: Calling groups_inventory to load vars for managed_node1 11000 1726867148.49400: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867148.49410: Calling all_plugins_play to load vars for managed_node1 11000 1726867148.49414: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867148.49417: Calling groups_plugins_play to load vars for managed_node1 11000 1726867148.49826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867148.50215: done with get_vars() 11000 1726867148.50225: done getting variables 11000 1726867148.50253: done sending task result for task 0affcac9-a3a5-c734-026a-000000000189 11000 1726867148.50257: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:19:08 -0400 (0:00:00.077) 0:00:10.146 ****** 11000 1726867148.50336: entering _queue_task() for managed_node1/stat 11000 1726867148.50687: worker is 1 (out of 1 available) 11000 1726867148.50705: exiting _queue_task() for managed_node1/stat 11000 1726867148.50715: done queuing things up, now waiting for results queue to drain 11000 1726867148.50716: waiting for pending results... 11000 1726867148.50889: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 11000 1726867148.51040: in run() - task 0affcac9-a3a5-c734-026a-00000000018b 11000 1726867148.51059: variable 'ansible_search_path' from source: unknown 11000 1726867148.51085: variable 'ansible_search_path' from source: unknown 11000 1726867148.51385: calling self._execute() 11000 1726867148.51388: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867148.51391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867148.51394: variable 'omit' from source: magic vars 11000 1726867148.52068: variable 'ansible_distribution_major_version' from source: facts 11000 1726867148.52115: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867148.52549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867148.52856: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867148.52913: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867148.52951: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867148.52995: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867148.53086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867148.53117: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867148.53153: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867148.53190: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867148.53289: variable '__network_is_ostree' from source: set_fact 11000 1726867148.53308: Evaluated conditional (not __network_is_ostree is defined): False 11000 1726867148.53351: when evaluation is False, skipping this task 11000 1726867148.53354: _execute() done 11000 1726867148.53357: dumping result to json 11000 1726867148.53359: done dumping result, returning 11000 1726867148.53361: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-c734-026a-00000000018b] 11000 1726867148.53364: sending task result for task 0affcac9-a3a5-c734-026a-00000000018b skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11000 1726867148.53546: no more pending results, returning what we have 11000 1726867148.53549: results queue empty 11000 1726867148.53550: checking for any_errors_fatal 11000 1726867148.53556: done checking for any_errors_fatal 11000 1726867148.53557: checking for max_fail_percentage 11000 1726867148.53558: done checking for max_fail_percentage 11000 1726867148.53559: checking to see if all hosts have failed and the running result is not ok 11000 1726867148.53560: done checking to see if all hosts have failed 11000 1726867148.53561: getting the remaining hosts for this loop 11000 1726867148.53568: done getting the remaining hosts for this loop 11000 1726867148.53572: getting the next task for host managed_node1 11000 1726867148.53581: done getting next task for host managed_node1 11000 1726867148.53585: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11000 1726867148.53589: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867148.53603: getting variables 11000 1726867148.53605: in VariableManager get_vars() 11000 1726867148.53646: Calling all_inventory to load vars for managed_node1 11000 1726867148.53649: Calling groups_inventory to load vars for managed_node1 11000 1726867148.53652: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867148.53663: Calling all_plugins_play to load vars for managed_node1 11000 1726867148.53666: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867148.53669: Calling groups_plugins_play to load vars for managed_node1 11000 1726867148.53793: done sending task result for task 0affcac9-a3a5-c734-026a-00000000018b 11000 1726867148.53797: WORKER PROCESS EXITING 11000 1726867148.54049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867148.54385: done with get_vars() 11000 1726867148.54485: done getting variables 11000 1726867148.54533: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:19:08 -0400 (0:00:00.043) 0:00:10.189 ****** 11000 1726867148.54680: entering _queue_task() for managed_node1/set_fact 11000 1726867148.55116: worker is 1 (out of 1 available) 11000 1726867148.55127: exiting _queue_task() for managed_node1/set_fact 11000 1726867148.55137: done queuing things up, now waiting for results queue to drain 11000 1726867148.55138: waiting for pending results... 
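The stat probe skipped above (set_facts.yml:12) and the set_fact just queued (set_facts.yml:17) form the role's ostree detection pair, both guarded by not __network_is_ostree is defined; the guard is False here because __network_is_ostree already exists from an earlier set_fact. A sketch of that pattern, in which the probe path, the register name, and the set_fact expression are assumptions (only the task names, the modules, and the when-condition appear in this log):

    - name: Check if system is ostree                  # set_facts.yml:12
      stat:
        path: /run/ostree-booted                       # assumed probe path
      register: __ostree_stat                          # hypothetical register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree      # set_facts.yml:17
      set_fact:
        __network_is_ostree: "{{ __ostree_stat.stat.exists }}"   # hypothetical expression
      when: not __network_is_ostree is defined
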
11000 1726867148.55458: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11000 1726867148.55810: in run() - task 0affcac9-a3a5-c734-026a-00000000018c 11000 1726867148.55830: variable 'ansible_search_path' from source: unknown 11000 1726867148.55838: variable 'ansible_search_path' from source: unknown 11000 1726867148.55880: calling self._execute() 11000 1726867148.55957: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867148.55970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867148.55990: variable 'omit' from source: magic vars 11000 1726867148.56468: variable 'ansible_distribution_major_version' from source: facts 11000 1726867148.56533: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867148.56855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867148.57564: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867148.57617: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867148.57830: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867148.57982: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867148.57988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867148.58045: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867148.58098: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867148.58130: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867148.58220: variable '__network_is_ostree' from source: set_fact 11000 1726867148.58232: Evaluated conditional (not __network_is_ostree is defined): False 11000 1726867148.58240: when evaluation is False, skipping this task 11000 1726867148.58246: _execute() done 11000 1726867148.58254: dumping result to json 11000 1726867148.58262: done dumping result, returning 11000 1726867148.58273: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-c734-026a-00000000018c] 11000 1726867148.58284: sending task result for task 0affcac9-a3a5-c734-026a-00000000018c skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11000 1726867148.58413: no more pending results, returning what we have 11000 1726867148.58416: results queue empty 11000 1726867148.58417: checking for any_errors_fatal 11000 1726867148.58421: done checking for any_errors_fatal 11000 1726867148.58421: checking for max_fail_percentage 11000 1726867148.58423: done checking for max_fail_percentage 11000 1726867148.58424: checking to see 
if all hosts have failed and the running result is not ok 11000 1726867148.58424: done checking to see if all hosts have failed 11000 1726867148.58425: getting the remaining hosts for this loop 11000 1726867148.58426: done getting the remaining hosts for this loop 11000 1726867148.58429: getting the next task for host managed_node1 11000 1726867148.58439: done getting next task for host managed_node1 11000 1726867148.58442: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11000 1726867148.58445: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867148.58457: getting variables 11000 1726867148.58458: in VariableManager get_vars() 11000 1726867148.58600: Calling all_inventory to load vars for managed_node1 11000 1726867148.58603: Calling groups_inventory to load vars for managed_node1 11000 1726867148.58605: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867148.58613: done sending task result for task 0affcac9-a3a5-c734-026a-00000000018c 11000 1726867148.58615: WORKER PROCESS EXITING 11000 1726867148.58622: Calling all_plugins_play to load vars for managed_node1 11000 1726867148.58625: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867148.58627: Calling groups_plugins_play to load vars for managed_node1 11000 1726867148.58920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867148.59138: done with get_vars() 11000 1726867148.59147: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:19:08 -0400 (0:00:00.045) 0:00:10.235 ****** 11000 1726867148.59247: entering _queue_task() for managed_node1/service_facts 11000 1726867148.59248: Creating lock for service_facts 11000 1726867148.59504: worker is 1 (out of 1 available) 11000 1726867148.59518: exiting _queue_task() for managed_node1/service_facts 11000 1726867148.59534: done queuing things up, now waiting for results queue to drain 11000 1726867148.59536: waiting for pending results... 
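The task queued here (set_facts.yml:21) is a bare service_facts call; everything that follows in the trace is the generic execution machinery: reusing the multiplexed SSH connection, creating a remote temp directory, transferring the AnsiballZ-wrapped module over sftp, marking it executable, and running it with the remote Python 3.12. As a sketch (no module arguments are visible in this log), the task amounts to:

    - name: Check which services are running           # set_facts.yml:21
      service_facts:

    # The module reports its data under ansible_facts.services; the JSON chunks further
    # down show entries of the form
    #   "NetworkManager.service": {"name": "NetworkManager.service", "state": "running",
    #                              "status": "enabled", "source": "systemd"}
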
11000 1726867148.59782: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 11000 1726867148.59926: in run() - task 0affcac9-a3a5-c734-026a-00000000018e 11000 1726867148.59947: variable 'ansible_search_path' from source: unknown 11000 1726867148.59955: variable 'ansible_search_path' from source: unknown 11000 1726867148.60001: calling self._execute() 11000 1726867148.60100: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867148.60113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867148.60126: variable 'omit' from source: magic vars 11000 1726867148.60495: variable 'ansible_distribution_major_version' from source: facts 11000 1726867148.60518: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867148.60529: variable 'omit' from source: magic vars 11000 1726867148.60600: variable 'omit' from source: magic vars 11000 1726867148.60644: variable 'omit' from source: magic vars 11000 1726867148.60731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867148.60734: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867148.60751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867148.60772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867148.60794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867148.60827: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867148.60845: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867148.60854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867148.60957: Set connection var ansible_shell_type to sh 11000 1726867148.60983: Set connection var ansible_pipelining to False 11000 1726867148.60988: Set connection var ansible_shell_executable to /bin/sh 11000 1726867148.60991: Set connection var ansible_connection to ssh 11000 1726867148.61058: Set connection var ansible_timeout to 10 11000 1726867148.61061: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867148.61063: variable 'ansible_shell_executable' from source: unknown 11000 1726867148.61065: variable 'ansible_connection' from source: unknown 11000 1726867148.61067: variable 'ansible_module_compression' from source: unknown 11000 1726867148.61069: variable 'ansible_shell_type' from source: unknown 11000 1726867148.61071: variable 'ansible_shell_executable' from source: unknown 11000 1726867148.61073: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867148.61074: variable 'ansible_pipelining' from source: unknown 11000 1726867148.61076: variable 'ansible_timeout' from source: unknown 11000 1726867148.61091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867148.61296: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11000 1726867148.61313: variable 'omit' from source: magic vars 11000 
1726867148.61323: starting attempt loop 11000 1726867148.61389: running the handler 11000 1726867148.61393: _low_level_execute_command(): starting 11000 1726867148.61395: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867148.62072: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867148.62093: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867148.62158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867148.62210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867148.62226: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867148.62240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867148.62320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867148.63993: stdout chunk (state=3): >>>/root <<< 11000 1726867148.64151: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867148.64154: stdout chunk (state=3): >>><<< 11000 1726867148.64156: stderr chunk (state=3): >>><<< 11000 1726867148.64173: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867148.64256: _low_level_execute_command(): starting 11000 1726867148.64262: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867148.6418176-11466-259937225703929 
`" && echo ansible-tmp-1726867148.6418176-11466-259937225703929="` echo /root/.ansible/tmp/ansible-tmp-1726867148.6418176-11466-259937225703929 `" ) && sleep 0' 11000 1726867148.64820: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867148.64841: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867148.64895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867148.64970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867148.64995: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867148.65018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867148.65096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867148.66962: stdout chunk (state=3): >>>ansible-tmp-1726867148.6418176-11466-259937225703929=/root/.ansible/tmp/ansible-tmp-1726867148.6418176-11466-259937225703929 <<< 11000 1726867148.67173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867148.67179: stdout chunk (state=3): >>><<< 11000 1726867148.67182: stderr chunk (state=3): >>><<< 11000 1726867148.67199: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867148.6418176-11466-259937225703929=/root/.ansible/tmp/ansible-tmp-1726867148.6418176-11466-259937225703929 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867148.67289: variable 'ansible_module_compression' from source: unknown 11000 1726867148.67298: ANSIBALLZ: Using 
lock for service_facts 11000 1726867148.67305: ANSIBALLZ: Acquiring lock 11000 1726867148.67312: ANSIBALLZ: Lock acquired: 139984827081088 11000 1726867148.67320: ANSIBALLZ: Creating module 11000 1726867148.81697: ANSIBALLZ: Writing module into payload 11000 1726867148.81789: ANSIBALLZ: Writing module 11000 1726867148.81819: ANSIBALLZ: Renaming module 11000 1726867148.81825: ANSIBALLZ: Done creating module 11000 1726867148.81844: variable 'ansible_facts' from source: unknown 11000 1726867148.81921: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867148.6418176-11466-259937225703929/AnsiballZ_service_facts.py 11000 1726867148.82185: Sending initial data 11000 1726867148.82188: Sent initial data (162 bytes) 11000 1726867148.82679: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867148.82694: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867148.82703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867148.82718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867148.82912: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867148.82915: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867148.82917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867148.82984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867148.83066: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867148.84729: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11000 1726867148.84733: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867148.84775: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867148.84850: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpd42dc81n /root/.ansible/tmp/ansible-tmp-1726867148.6418176-11466-259937225703929/AnsiballZ_service_facts.py <<< 11000 1726867148.84858: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867148.6418176-11466-259937225703929/AnsiballZ_service_facts.py" <<< 11000 1726867148.84903: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpd42dc81n" to remote "/root/.ansible/tmp/ansible-tmp-1726867148.6418176-11466-259937225703929/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867148.6418176-11466-259937225703929/AnsiballZ_service_facts.py" <<< 11000 1726867148.85625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867148.85739: stdout chunk (state=3): >>><<< 11000 1726867148.85742: stderr chunk (state=3): >>><<< 11000 1726867148.85752: done transferring module to remote 11000 1726867148.85766: _low_level_execute_command(): starting 11000 1726867148.85775: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867148.6418176-11466-259937225703929/ /root/.ansible/tmp/ansible-tmp-1726867148.6418176-11466-259937225703929/AnsiballZ_service_facts.py && sleep 0' 11000 1726867148.86495: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867148.86501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867148.86525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867148.86544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867148.86627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867148.88537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867148.88541: stdout chunk (state=3): >>><<< 11000 1726867148.88544: stderr chunk (state=3): >>><<< 11000 1726867148.88547: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867148.88549: _low_level_execute_command(): starting 11000 1726867148.88552: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867148.6418176-11466-259937225703929/AnsiballZ_service_facts.py && sleep 0' 11000 1726867148.89072: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867148.89096: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867148.89108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867148.89123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867148.89137: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867148.89146: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867148.89156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867148.89245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867148.89266: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867148.89345: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867150.46751: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-ma<<< 11000 1726867150.46764: stdout chunk (state=3): >>>rk.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.s<<< 11000 1726867150.46775: stdout chunk (state=3): >>>ervice", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "stat<<< 11000 1726867150.46796: stdout chunk (state=3): >>>us": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": 
"dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state":<<< 11000 1726867150.46810: stdout chunk (state=3): >>> "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11000 1726867150.48244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867150.48275: stderr chunk (state=3): >>><<< 11000 1726867150.48280: stdout chunk (state=3): >>><<< 11000 1726867150.48305: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": 
"stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": 
"systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 11000 1726867150.49501: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867148.6418176-11466-259937225703929/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867150.49512: _low_level_execute_command(): starting 11000 1726867150.49515: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867148.6418176-11466-259937225703929/ > /dev/null 2>&1 && sleep 0' 11000 1726867150.50036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867150.50040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867150.50083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867150.50109: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867150.50168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867150.51942: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867150.51964: stderr chunk (state=3): >>><<< 11000 1726867150.51967: stdout chunk (state=3): >>><<< 11000 1726867150.51981: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867150.51989: handler run complete 11000 1726867150.52095: variable 'ansible_facts' from source: unknown 11000 1726867150.52205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867150.52459: variable 'ansible_facts' from source: unknown 11000 1726867150.52982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867150.52985: attempt loop complete, returning result 11000 1726867150.52990: _execute() done 11000 1726867150.52992: dumping result to json 11000 1726867150.52993: done dumping result, returning 11000 1726867150.52995: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-c734-026a-00000000018e] 11000 1726867150.52997: sending task result for task 0affcac9-a3a5-c734-026a-00000000018e ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11000 1726867150.53721: no more pending results, returning what we have 11000 1726867150.53724: results queue empty 11000 1726867150.53725: checking for any_errors_fatal 11000 1726867150.53728: done checking for any_errors_fatal 11000 1726867150.53729: checking for max_fail_percentage 11000 1726867150.53731: done checking for max_fail_percentage 11000 1726867150.53732: checking to see if all hosts have failed and the running result is not ok 11000 1726867150.53732: done checking to see if all hosts have failed 11000 1726867150.53733: getting the remaining hosts for this loop 11000 1726867150.53734: done getting the remaining hosts for this loop 11000 1726867150.53737: getting the next task for host managed_node1 11000 1726867150.53742: done getting next task for host managed_node1 11000 1726867150.53745: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11000 1726867150.53749: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11000 1726867150.53758: getting variables 11000 1726867150.53759: in VariableManager get_vars() 11000 1726867150.53796: Calling all_inventory to load vars for managed_node1 11000 1726867150.53799: Calling groups_inventory to load vars for managed_node1 11000 1726867150.53802: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867150.53811: Calling all_plugins_play to load vars for managed_node1 11000 1726867150.53814: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867150.53817: Calling groups_plugins_play to load vars for managed_node1 11000 1726867150.54392: done sending task result for task 0affcac9-a3a5-c734-026a-00000000018e 11000 1726867150.54396: WORKER PROCESS EXITING 11000 1726867150.54456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867150.54948: done with get_vars() 11000 1726867150.54959: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:19:10 -0400 (0:00:01.958) 0:00:12.193 ****** 11000 1726867150.55052: entering _queue_task() for managed_node1/package_facts 11000 1726867150.55053: Creating lock for package_facts 11000 1726867150.55317: worker is 1 (out of 1 available) 11000 1726867150.55328: exiting _queue_task() for managed_node1/package_facts 11000 1726867150.55340: done queuing things up, now waiting for results queue to drain 11000 1726867150.55341: waiting for pending results... 11000 1726867150.55592: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 11000 1726867150.55783: in run() - task 0affcac9-a3a5-c734-026a-00000000018f 11000 1726867150.55791: variable 'ansible_search_path' from source: unknown 11000 1726867150.55794: variable 'ansible_search_path' from source: unknown 11000 1726867150.55803: calling self._execute() 11000 1726867150.55909: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867150.55913: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867150.55915: variable 'omit' from source: magic vars 11000 1726867150.56268: variable 'ansible_distribution_major_version' from source: facts 11000 1726867150.56287: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867150.56299: variable 'omit' from source: magic vars 11000 1726867150.56370: variable 'omit' from source: magic vars 11000 1726867150.56451: variable 'omit' from source: magic vars 11000 1726867150.56469: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867150.56511: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867150.56534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867150.56557: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867150.56582: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867150.56782: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 
1726867150.56785: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867150.56788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867150.56790: Set connection var ansible_shell_type to sh 11000 1726867150.56792: Set connection var ansible_pipelining to False 11000 1726867150.56794: Set connection var ansible_shell_executable to /bin/sh 11000 1726867150.56796: Set connection var ansible_connection to ssh 11000 1726867150.56798: Set connection var ansible_timeout to 10 11000 1726867150.56800: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867150.56802: variable 'ansible_shell_executable' from source: unknown 11000 1726867150.56804: variable 'ansible_connection' from source: unknown 11000 1726867150.56806: variable 'ansible_module_compression' from source: unknown 11000 1726867150.56808: variable 'ansible_shell_type' from source: unknown 11000 1726867150.56810: variable 'ansible_shell_executable' from source: unknown 11000 1726867150.56812: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867150.56814: variable 'ansible_pipelining' from source: unknown 11000 1726867150.56816: variable 'ansible_timeout' from source: unknown 11000 1726867150.56818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867150.57013: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11000 1726867150.57028: variable 'omit' from source: magic vars 11000 1726867150.57042: starting attempt loop 11000 1726867150.57050: running the handler 11000 1726867150.57066: _low_level_execute_command(): starting 11000 1726867150.57080: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867150.58703: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867150.58716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867150.58732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867150.58808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867150.60400: stdout chunk (state=3): >>>/root <<< 11000 1726867150.60543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867150.60569: stdout chunk (state=3): >>><<< 11000 1726867150.60945: stderr chunk 
(state=3): >>><<< 11000 1726867150.60950: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867150.60952: _low_level_execute_command(): starting 11000 1726867150.60956: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867150.608028-11566-52720653420156 `" && echo ansible-tmp-1726867150.608028-11566-52720653420156="` echo /root/.ansible/tmp/ansible-tmp-1726867150.608028-11566-52720653420156 `" ) && sleep 0' 11000 1726867150.61558: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867150.61591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867150.61609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867150.61710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867150.63587: stdout chunk (state=3): >>>ansible-tmp-1726867150.608028-11566-52720653420156=/root/.ansible/tmp/ansible-tmp-1726867150.608028-11566-52720653420156 <<< 11000 1726867150.63727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867150.63737: stdout chunk (state=3): >>><<< 11000 1726867150.63751: stderr chunk (state=3): >>><<< 11000 1726867150.63774: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726867150.608028-11566-52720653420156=/root/.ansible/tmp/ansible-tmp-1726867150.608028-11566-52720653420156 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867150.63833: variable 'ansible_module_compression' from source: unknown 11000 1726867150.63901: ANSIBALLZ: Using lock for package_facts 11000 1726867150.63907: ANSIBALLZ: Acquiring lock 11000 1726867150.63914: ANSIBALLZ: Lock acquired: 139984828513888 11000 1726867150.63921: ANSIBALLZ: Creating module 11000 1726867150.88643: ANSIBALLZ: Writing module into payload 11000 1726867150.88790: ANSIBALLZ: Writing module 11000 1726867150.88991: ANSIBALLZ: Renaming module 11000 1726867150.88994: ANSIBALLZ: Done creating module 11000 1726867150.88999: variable 'ansible_facts' from source: unknown 11000 1726867150.89065: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867150.608028-11566-52720653420156/AnsiballZ_package_facts.py 11000 1726867150.89223: Sending initial data 11000 1726867150.89334: Sent initial data (160 bytes) 11000 1726867150.89908: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867150.90012: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867150.90057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867150.90072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867150.90100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867150.90173: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867150.91821: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867150.91990: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11000 1726867150.91994: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmp2uz8vctm /root/.ansible/tmp/ansible-tmp-1726867150.608028-11566-52720653420156/AnsiballZ_package_facts.py <<< 11000 1726867150.91996: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867150.608028-11566-52720653420156/AnsiballZ_package_facts.py" <<< 11000 1726867150.92102: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmp2uz8vctm" to remote "/root/.ansible/tmp/ansible-tmp-1726867150.608028-11566-52720653420156/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867150.608028-11566-52720653420156/AnsiballZ_package_facts.py" <<< 11000 1726867150.94980: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867150.95053: stderr chunk (state=3): >>><<< 11000 1726867150.95151: stdout chunk (state=3): >>><<< 11000 1726867150.95154: done transferring module to remote 11000 1726867150.95156: _low_level_execute_command(): starting 11000 1726867150.95159: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867150.608028-11566-52720653420156/ /root/.ansible/tmp/ansible-tmp-1726867150.608028-11566-52720653420156/AnsiballZ_package_facts.py && sleep 0' 11000 1726867150.95721: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867150.95738: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867150.95754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867150.95778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867150.95817: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867150.95837: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867150.95875: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867150.95972: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867150.96075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867150.96102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867150.96294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867150.98212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867150.98215: stdout chunk (state=3): >>><<< 11000 1726867150.98218: stderr chunk (state=3): >>><<< 11000 1726867150.98221: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867150.98223: _low_level_execute_command(): starting 11000 1726867150.98226: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867150.608028-11566-52720653420156/AnsiballZ_package_facts.py && sleep 0' 11000 1726867150.99486: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867150.99489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11000 1726867150.99492: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867150.99494: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11000 1726867150.99508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867150.99569: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867150.99645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867150.99882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867151.44142: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 11000 1726867151.44150: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "rele<<< 11000 1726867151.44164: stdout chunk (state=3): >>>ase": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", 
"release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 11000 1726867151.44200: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", 
"version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 11000 1726867151.44340: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", 
"version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 11000 1726867151.44345: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", 
"version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", 
"release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": 
[{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 11000 1726867151.44355: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 11000 1726867151.44358: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11000 1726867151.46376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 11000 1726867151.46382: stdout chunk (state=3): >>><<< 11000 1726867151.46384: stderr chunk (state=3): >>><<< 11000 1726867151.46469: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 11000 1726867151.52415: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867150.608028-11566-52720653420156/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867151.52520: _low_level_execute_command(): starting 11000 1726867151.52524: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867150.608028-11566-52720653420156/ > /dev/null 2>&1 && sleep 0' 11000 1726867151.53733: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867151.53797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867151.53969: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867151.54210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867151.54302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867151.56288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867151.56299: stdout chunk (state=3): >>><<< 11000 1726867151.56310: stderr chunk (state=3): >>><<< 11000 1726867151.56362: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867151.56374: handler run complete 11000 1726867151.58585: variable 'ansible_facts' from source: unknown 11000 1726867151.59509: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867151.63551: variable 'ansible_facts' from source: unknown 11000 1726867151.64335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867151.65747: attempt loop complete, returning result 11000 1726867151.65800: _execute() done 11000 1726867151.65926: dumping result to json 11000 1726867151.66271: done dumping result, returning 11000 1726867151.66290: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-c734-026a-00000000018f] 11000 1726867151.66472: sending task result for task 0affcac9-a3a5-c734-026a-00000000018f 11000 1726867151.70561: done sending task result for task 0affcac9-a3a5-c734-026a-00000000018f 11000 1726867151.70564: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11000 1726867151.70662: no more pending results, returning what we have 11000 1726867151.70664: results queue empty 11000 1726867151.70665: checking for any_errors_fatal 11000 1726867151.70670: done checking for any_errors_fatal 11000 1726867151.70670: checking for max_fail_percentage 11000 1726867151.70672: done checking for max_fail_percentage 11000 1726867151.70672: checking to see if all hosts have failed and the running result is not ok 11000 1726867151.70673: done checking to see if all hosts have failed 11000 1726867151.70674: getting the remaining hosts for this loop 11000 1726867151.70675: done getting the remaining hosts for this loop 11000 1726867151.70680: getting the next task for host managed_node1 11000 1726867151.70689: done getting next task for host managed_node1 11000 1726867151.70693: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11000 1726867151.70695: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867151.70705: getting variables 11000 1726867151.70706: in VariableManager get_vars() 11000 1726867151.70737: Calling all_inventory to load vars for managed_node1 11000 1726867151.70739: Calling groups_inventory to load vars for managed_node1 11000 1726867151.70741: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867151.70750: Calling all_plugins_play to load vars for managed_node1 11000 1726867151.70752: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867151.70755: Calling groups_plugins_play to load vars for managed_node1 11000 1726867151.73137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867151.76408: done with get_vars() 11000 1726867151.76547: done getting variables 11000 1726867151.76613: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:19:11 -0400 (0:00:01.215) 0:00:13.410 ****** 11000 1726867151.76762: entering _queue_task() for managed_node1/debug 11000 1726867151.77559: worker is 1 (out of 1 available) 11000 1726867151.77572: exiting _queue_task() for managed_node1/debug 11000 1726867151.77588: done queuing things up, now waiting for results queue to drain 11000 1726867151.77590: waiting for pending results... 11000 1726867151.77972: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 11000 1726867151.78205: in run() - task 0affcac9-a3a5-c734-026a-000000000027 11000 1726867151.78226: variable 'ansible_search_path' from source: unknown 11000 1726867151.78398: variable 'ansible_search_path' from source: unknown 11000 1726867151.78402: calling self._execute() 11000 1726867151.78531: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867151.78544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867151.78598: variable 'omit' from source: magic vars 11000 1726867151.79420: variable 'ansible_distribution_major_version' from source: facts 11000 1726867151.79440: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867151.79451: variable 'omit' from source: magic vars 11000 1726867151.79707: variable 'omit' from source: magic vars 11000 1726867151.79788: variable 'network_provider' from source: set_fact 11000 1726867151.79943: variable 'omit' from source: magic vars 11000 1726867151.79991: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867151.80042: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867151.80130: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867151.80160: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867151.80180: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 
1726867151.80360: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867151.80363: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867151.80365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867151.80578: Set connection var ansible_shell_type to sh 11000 1726867151.80582: Set connection var ansible_pipelining to False 11000 1726867151.80584: Set connection var ansible_shell_executable to /bin/sh 11000 1726867151.80588: Set connection var ansible_connection to ssh 11000 1726867151.80590: Set connection var ansible_timeout to 10 11000 1726867151.80601: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867151.80630: variable 'ansible_shell_executable' from source: unknown 11000 1726867151.80639: variable 'ansible_connection' from source: unknown 11000 1726867151.80689: variable 'ansible_module_compression' from source: unknown 11000 1726867151.80699: variable 'ansible_shell_type' from source: unknown 11000 1726867151.80706: variable 'ansible_shell_executable' from source: unknown 11000 1726867151.80713: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867151.80720: variable 'ansible_pipelining' from source: unknown 11000 1726867151.80726: variable 'ansible_timeout' from source: unknown 11000 1726867151.80733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867151.81035: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867151.81121: variable 'omit' from source: magic vars 11000 1726867151.81125: starting attempt loop 11000 1726867151.81127: running the handler 11000 1726867151.81232: handler run complete 11000 1726867151.81235: attempt loop complete, returning result 11000 1726867151.81237: _execute() done 11000 1726867151.81240: dumping result to json 11000 1726867151.81242: done dumping result, returning 11000 1726867151.81340: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-c734-026a-000000000027] 11000 1726867151.81343: sending task result for task 0affcac9-a3a5-c734-026a-000000000027 11000 1726867151.81488: done sending task result for task 0affcac9-a3a5-c734-026a-000000000027 11000 1726867151.81491: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 11000 1726867151.81579: no more pending results, returning what we have 11000 1726867151.81583: results queue empty 11000 1726867151.81584: checking for any_errors_fatal 11000 1726867151.81598: done checking for any_errors_fatal 11000 1726867151.81599: checking for max_fail_percentage 11000 1726867151.81601: done checking for max_fail_percentage 11000 1726867151.81601: checking to see if all hosts have failed and the running result is not ok 11000 1726867151.81602: done checking to see if all hosts have failed 11000 1726867151.81603: getting the remaining hosts for this loop 11000 1726867151.81604: done getting the remaining hosts for this loop 11000 1726867151.81608: getting the next task for host managed_node1 11000 1726867151.81616: done getting next task for host managed_node1 11000 1726867151.81620: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 11000 1726867151.81623: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867151.81635: getting variables 11000 1726867151.81637: in VariableManager get_vars() 11000 1726867151.81901: Calling all_inventory to load vars for managed_node1 11000 1726867151.81905: Calling groups_inventory to load vars for managed_node1 11000 1726867151.81909: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867151.81920: Calling all_plugins_play to load vars for managed_node1 11000 1726867151.81923: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867151.81927: Calling groups_plugins_play to load vars for managed_node1 11000 1726867151.84957: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867151.88446: done with get_vars() 11000 1726867151.88474: done getting variables 11000 1726867151.88640: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:19:11 -0400 (0:00:00.120) 0:00:13.530 ****** 11000 1726867151.88793: entering _queue_task() for managed_node1/fail 11000 1726867151.88795: Creating lock for fail 11000 1726867151.89560: worker is 1 (out of 1 available) 11000 1726867151.89571: exiting _queue_task() for managed_node1/fail 11000 1726867151.89583: done queuing things up, now waiting for results queue to drain 11000 1726867151.89584: waiting for pending results... 
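
The package_facts result above is censored because the task runs with '_ansible_no_log': True, but the logged invocation still records its module arguments ("manager": ["auto"], "strategy": "first"). A minimal sketch of an equivalent task, reconstructed only from those arguments and the TASK header "Check which packages are installed" (the actual role task file is not reproduced in this log), would be:

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto     # matches "manager": ["auto"] in the logged invocation
        strategy: first   # matches "strategy": "first" in the logged invocation
      no_log: true        # inferred from the censored result and '_ansible_no_log': True

Even with no_log enabled, the gathered ansible_facts.packages dictionary is still set on the host and available to later tasks; only the printed result is hidden. The "Print network provider" debug task above, whose result is MSG: Using network provider: nm, can likewise be sketched (the exact wording of the real task is an assumption):

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"   # network_provider comes from set_fact, per the trace
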
11000 1726867151.90115: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11000 1726867151.90212: in run() - task 0affcac9-a3a5-c734-026a-000000000028 11000 1726867151.90217: variable 'ansible_search_path' from source: unknown 11000 1726867151.90220: variable 'ansible_search_path' from source: unknown 11000 1726867151.90317: calling self._execute() 11000 1726867151.90543: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867151.90547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867151.90550: variable 'omit' from source: magic vars 11000 1726867151.91316: variable 'ansible_distribution_major_version' from source: facts 11000 1726867151.91332: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867151.91561: variable 'network_state' from source: role '' defaults 11000 1726867151.91735: Evaluated conditional (network_state != {}): False 11000 1726867151.91738: when evaluation is False, skipping this task 11000 1726867151.91741: _execute() done 11000 1726867151.91744: dumping result to json 11000 1726867151.91746: done dumping result, returning 11000 1726867151.91749: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-c734-026a-000000000028] 11000 1726867151.91752: sending task result for task 0affcac9-a3a5-c734-026a-000000000028 11000 1726867151.91821: done sending task result for task 0affcac9-a3a5-c734-026a-000000000028 11000 1726867151.91824: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11000 1726867151.91885: no more pending results, returning what we have 11000 1726867151.91891: results queue empty 11000 1726867151.91892: checking for any_errors_fatal 11000 1726867151.91898: done checking for any_errors_fatal 11000 1726867151.91899: checking for max_fail_percentage 11000 1726867151.91900: done checking for max_fail_percentage 11000 1726867151.91902: checking to see if all hosts have failed and the running result is not ok 11000 1726867151.91903: done checking to see if all hosts have failed 11000 1726867151.91903: getting the remaining hosts for this loop 11000 1726867151.91905: done getting the remaining hosts for this loop 11000 1726867151.91908: getting the next task for host managed_node1 11000 1726867151.91914: done getting next task for host managed_node1 11000 1726867151.91918: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11000 1726867151.91922: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867151.91937: getting variables 11000 1726867151.91939: in VariableManager get_vars() 11000 1726867151.91979: Calling all_inventory to load vars for managed_node1 11000 1726867151.91983: Calling groups_inventory to load vars for managed_node1 11000 1726867151.91985: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867151.91999: Calling all_plugins_play to load vars for managed_node1 11000 1726867151.92002: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867151.92005: Calling groups_plugins_play to load vars for managed_node1 11000 1726867151.94802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867151.98027: done with get_vars() 11000 1726867151.98048: done getting variables 11000 1726867151.98115: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:19:11 -0400 (0:00:00.093) 0:00:13.624 ****** 11000 1726867151.98149: entering _queue_task() for managed_node1/fail 11000 1726867151.98580: worker is 1 (out of 1 available) 11000 1726867151.98590: exiting _queue_task() for managed_node1/fail 11000 1726867151.98602: done queuing things up, now waiting for results queue to drain 11000 1726867151.98603: waiting for pending results... 
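
Both "Abort applying the network state configuration ..." checks in this trace evaluate only network_state != {} before being skipped: Ansible stops evaluating a when list at the first condition that is False, and network_state here comes from the role defaults as an empty dict. The tasks themselves (roles/network/tasks/main.yml:11 and :18) are not reproduced in this log, so the following is a hypothetical sketch of the pattern for the version check queued above, with the message and the version comparison assumed rather than taken from the log:

    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      ansible.builtin.fail:
        msg: Applying the network state configuration requires EL 8 or later   # message assumed
      when:
        - network_state != {}                            # the only condition visible in the trace
        - ansible_distribution_major_version | int < 8   # assumed from the task name
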
11000 1726867151.98839: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11000 1726867151.98913: in run() - task 0affcac9-a3a5-c734-026a-000000000029 11000 1726867151.98933: variable 'ansible_search_path' from source: unknown 11000 1726867151.98955: variable 'ansible_search_path' from source: unknown 11000 1726867151.98991: calling self._execute() 11000 1726867151.99093: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867151.99096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867151.99174: variable 'omit' from source: magic vars 11000 1726867151.99772: variable 'ansible_distribution_major_version' from source: facts 11000 1726867152.00015: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867152.00117: variable 'network_state' from source: role '' defaults 11000 1726867152.00236: Evaluated conditional (network_state != {}): False 11000 1726867152.00239: when evaluation is False, skipping this task 11000 1726867152.00241: _execute() done 11000 1726867152.00244: dumping result to json 11000 1726867152.00246: done dumping result, returning 11000 1726867152.00249: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-c734-026a-000000000029] 11000 1726867152.00252: sending task result for task 0affcac9-a3a5-c734-026a-000000000029 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11000 1726867152.00500: no more pending results, returning what we have 11000 1726867152.00503: results queue empty 11000 1726867152.00504: checking for any_errors_fatal 11000 1726867152.00513: done checking for any_errors_fatal 11000 1726867152.00514: checking for max_fail_percentage 11000 1726867152.00515: done checking for max_fail_percentage 11000 1726867152.00516: checking to see if all hosts have failed and the running result is not ok 11000 1726867152.00518: done checking to see if all hosts have failed 11000 1726867152.00518: getting the remaining hosts for this loop 11000 1726867152.00519: done getting the remaining hosts for this loop 11000 1726867152.00523: getting the next task for host managed_node1 11000 1726867152.00530: done getting next task for host managed_node1 11000 1726867152.00533: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11000 1726867152.00537: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867152.00553: getting variables 11000 1726867152.00555: in VariableManager get_vars() 11000 1726867152.00595: Calling all_inventory to load vars for managed_node1 11000 1726867152.00599: Calling groups_inventory to load vars for managed_node1 11000 1726867152.00601: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867152.00613: Calling all_plugins_play to load vars for managed_node1 11000 1726867152.00615: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867152.00618: Calling groups_plugins_play to load vars for managed_node1 11000 1726867152.01301: done sending task result for task 0affcac9-a3a5-c734-026a-000000000029 11000 1726867152.01305: WORKER PROCESS EXITING 11000 1726867152.02223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867152.03895: done with get_vars() 11000 1726867152.03918: done getting variables 11000 1726867152.04092: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:19:12 -0400 (0:00:00.059) 0:00:13.684 ****** 11000 1726867152.04123: entering _queue_task() for managed_node1/fail 11000 1726867152.04770: worker is 1 (out of 1 available) 11000 1726867152.04931: exiting _queue_task() for managed_node1/fail 11000 1726867152.04944: done queuing things up, now waiting for results queue to drain 11000 1726867152.04945: waiting for pending results... 
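
Editor's note: when a when: condition evaluates to False, the executor skips the task and reports the failing expression as false_condition, which is exactly what the skip payload above shows for network_state != {}. The skip behaviour can be reproduced outside this run with a minimal standalone playbook (entirely hypothetical, shown only to make the result fields concrete):

    - hosts: managed_node1
      gather_facts: false
      vars:
        network_state: {}            # empty dict, mirroring the role default seen in the log
      tasks:
        - name: Guard task that is skipped
          ansible.builtin.fail:
            msg: network_state handling is not supported here   # hypothetical message
          when: network_state != {}  # False for an empty dict, so the result carries skip_reason and false_condition
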
11000 1726867152.05493: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11000 1726867152.06184: in run() - task 0affcac9-a3a5-c734-026a-00000000002a 11000 1726867152.06190: variable 'ansible_search_path' from source: unknown 11000 1726867152.06193: variable 'ansible_search_path' from source: unknown 11000 1726867152.06195: calling self._execute() 11000 1726867152.06359: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867152.06440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867152.06562: variable 'omit' from source: magic vars 11000 1726867152.07024: variable 'ansible_distribution_major_version' from source: facts 11000 1726867152.07042: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867152.07227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867152.10535: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867152.10614: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867152.10651: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867152.10688: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867152.10726: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867152.10817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867152.10853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867152.10885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.10939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867152.10958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867152.11065: variable 'ansible_distribution_major_version' from source: facts 11000 1726867152.11089: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11000 1726867152.11215: variable 'ansible_distribution' from source: facts 11000 1726867152.11225: variable '__network_rh_distros' from source: role '' defaults 11000 1726867152.11246: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11000 1726867152.11569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867152.11572: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867152.11575: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.11612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867152.11631: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867152.11683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867152.11783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867152.11786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.11788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867152.11810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867152.11856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867152.11887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867152.11923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.11966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867152.11986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867152.12316: variable 'network_connections' from source: task vars 11000 1726867152.12337: variable 'controller_profile' from source: play vars 11000 1726867152.12462: variable 'controller_profile' from source: play vars 11000 1726867152.12726: variable 'controller_device' from source: play vars 11000 1726867152.12729: variable 'controller_device' from source: play vars 11000 1726867152.12732: variable 'port1_profile' from 
source: play vars 11000 1726867152.12734: variable 'port1_profile' from source: play vars 11000 1726867152.12736: variable 'dhcp_interface1' from source: play vars 11000 1726867152.12739: variable 'dhcp_interface1' from source: play vars 11000 1726867152.12741: variable 'controller_profile' from source: play vars 11000 1726867152.13037: variable 'controller_profile' from source: play vars 11000 1726867152.13089: variable 'port2_profile' from source: play vars 11000 1726867152.13294: variable 'port2_profile' from source: play vars 11000 1726867152.13298: variable 'dhcp_interface2' from source: play vars 11000 1726867152.13327: variable 'dhcp_interface2' from source: play vars 11000 1726867152.13392: variable 'controller_profile' from source: play vars 11000 1726867152.13482: variable 'controller_profile' from source: play vars 11000 1726867152.13785: variable 'network_state' from source: role '' defaults 11000 1726867152.13789: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867152.14045: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867152.14120: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867152.14155: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867152.14250: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867152.14386: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867152.14413: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867152.14450: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.14510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867152.14656: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11000 1726867152.14664: when evaluation is False, skipping this task 11000 1726867152.14670: _execute() done 11000 1726867152.14680: dumping result to json 11000 1726867152.14688: done dumping result, returning 11000 1726867152.14700: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-c734-026a-00000000002a] 11000 1726867152.14709: sending task result for task 0affcac9-a3a5-c734-026a-00000000002a skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", 
\"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11000 1726867152.15020: no more pending results, returning what we have 11000 1726867152.15024: results queue empty 11000 1726867152.15025: checking for any_errors_fatal 11000 1726867152.15030: done checking for any_errors_fatal 11000 1726867152.15031: checking for max_fail_percentage 11000 1726867152.15033: done checking for max_fail_percentage 11000 1726867152.15034: checking to see if all hosts have failed and the running result is not ok 11000 1726867152.15035: done checking to see if all hosts have failed 11000 1726867152.15035: getting the remaining hosts for this loop 11000 1726867152.15037: done getting the remaining hosts for this loop 11000 1726867152.15041: getting the next task for host managed_node1 11000 1726867152.15048: done getting next task for host managed_node1 11000 1726867152.15051: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11000 1726867152.15054: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867152.15070: getting variables 11000 1726867152.15072: in VariableManager get_vars() 11000 1726867152.15115: Calling all_inventory to load vars for managed_node1 11000 1726867152.15118: Calling groups_inventory to load vars for managed_node1 11000 1726867152.15120: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867152.15130: Calling all_plugins_play to load vars for managed_node1 11000 1726867152.15133: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867152.15136: Calling groups_plugins_play to load vars for managed_node1 11000 1726867152.15841: done sending task result for task 0affcac9-a3a5-c734-026a-00000000002a 11000 1726867152.15845: WORKER PROCESS EXITING 11000 1726867152.18524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867152.20891: done with get_vars() 11000 1726867152.20919: done getting variables 11000 1726867152.21025: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:19:12 -0400 (0:00:00.170) 0:00:13.854 ****** 11000 1726867152.21173: entering _queue_task() for managed_node1/dnf 11000 1726867152.21840: worker is 1 (out of 1 available) 11000 1726867152.21852: exiting _queue_task() for managed_node1/dnf 11000 1726867152.21865: done queuing things up, now 
waiting for results queue to drain 11000 1726867152.21866: waiting for pending results... 11000 1726867152.22382: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11000 1726867152.22983: in run() - task 0affcac9-a3a5-c734-026a-00000000002b 11000 1726867152.22987: variable 'ansible_search_path' from source: unknown 11000 1726867152.22990: variable 'ansible_search_path' from source: unknown 11000 1726867152.22993: calling self._execute() 11000 1726867152.23369: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867152.23584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867152.23588: variable 'omit' from source: magic vars 11000 1726867152.24374: variable 'ansible_distribution_major_version' from source: facts 11000 1726867152.24629: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867152.24993: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867152.29340: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867152.29557: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867152.29749: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867152.29790: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867152.29891: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867152.29976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867152.30069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867152.30173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.30222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867152.30270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867152.30685: variable 'ansible_distribution' from source: facts 11000 1726867152.30689: variable 'ansible_distribution_major_version' from source: facts 11000 1726867152.30692: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11000 1726867152.30834: variable '__network_wireless_connections_defined' from source: role '' defaults 11000 1726867152.31120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11000 1726867152.31154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867152.31208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.31283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867152.31569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867152.31572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867152.31574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867152.31576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.31690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867152.31709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867152.31750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867152.31811: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867152.31913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.31955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867152.32019: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867152.32337: variable 'network_connections' from source: task vars 11000 1726867152.32355: variable 'controller_profile' from source: play vars 11000 1726867152.32501: variable 'controller_profile' from source: play vars 11000 1726867152.32559: variable 'controller_device' from source: play vars 11000 1726867152.32624: variable 
'controller_device' from source: play vars 11000 1726867152.32694: variable 'port1_profile' from source: play vars 11000 1726867152.32824: variable 'port1_profile' from source: play vars 11000 1726867152.32885: variable 'dhcp_interface1' from source: play vars 11000 1726867152.33082: variable 'dhcp_interface1' from source: play vars 11000 1726867152.33087: variable 'controller_profile' from source: play vars 11000 1726867152.33129: variable 'controller_profile' from source: play vars 11000 1726867152.33140: variable 'port2_profile' from source: play vars 11000 1726867152.33261: variable 'port2_profile' from source: play vars 11000 1726867152.33322: variable 'dhcp_interface2' from source: play vars 11000 1726867152.33425: variable 'dhcp_interface2' from source: play vars 11000 1726867152.33439: variable 'controller_profile' from source: play vars 11000 1726867152.33886: variable 'controller_profile' from source: play vars 11000 1726867152.33898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867152.34067: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867152.34137: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867152.34245: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867152.34275: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867152.34359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867152.34456: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867152.34546: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.34573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867152.34636: variable '__network_team_connections_defined' from source: role '' defaults 11000 1726867152.35209: variable 'network_connections' from source: task vars 11000 1726867152.35219: variable 'controller_profile' from source: play vars 11000 1726867152.35281: variable 'controller_profile' from source: play vars 11000 1726867152.35522: variable 'controller_device' from source: play vars 11000 1726867152.35525: variable 'controller_device' from source: play vars 11000 1726867152.35526: variable 'port1_profile' from source: play vars 11000 1726867152.35607: variable 'port1_profile' from source: play vars 11000 1726867152.35620: variable 'dhcp_interface1' from source: play vars 11000 1726867152.35847: variable 'dhcp_interface1' from source: play vars 11000 1726867152.35850: variable 'controller_profile' from source: play vars 11000 1726867152.35912: variable 'controller_profile' from source: play vars 11000 1726867152.35924: variable 'port2_profile' from source: play vars 11000 1726867152.36110: variable 'port2_profile' from source: play vars 11000 1726867152.36113: variable 'dhcp_interface2' 
from source: play vars 11000 1726867152.36220: variable 'dhcp_interface2' from source: play vars 11000 1726867152.36223: variable 'controller_profile' from source: play vars 11000 1726867152.36343: variable 'controller_profile' from source: play vars 11000 1726867152.36426: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11000 1726867152.36505: when evaluation is False, skipping this task 11000 1726867152.36546: _execute() done 11000 1726867152.36549: dumping result to json 11000 1726867152.36551: done dumping result, returning 11000 1726867152.36552: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-c734-026a-00000000002b] 11000 1726867152.36554: sending task result for task 0affcac9-a3a5-c734-026a-00000000002b 11000 1726867152.36828: done sending task result for task 0affcac9-a3a5-c734-026a-00000000002b 11000 1726867152.36832: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11000 1726867152.36887: no more pending results, returning what we have 11000 1726867152.36890: results queue empty 11000 1726867152.36891: checking for any_errors_fatal 11000 1726867152.36897: done checking for any_errors_fatal 11000 1726867152.36898: checking for max_fail_percentage 11000 1726867152.36900: done checking for max_fail_percentage 11000 1726867152.36901: checking to see if all hosts have failed and the running result is not ok 11000 1726867152.36902: done checking to see if all hosts have failed 11000 1726867152.36902: getting the remaining hosts for this loop 11000 1726867152.36904: done getting the remaining hosts for this loop 11000 1726867152.36908: getting the next task for host managed_node1 11000 1726867152.36914: done getting next task for host managed_node1 11000 1726867152.36918: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11000 1726867152.36921: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867152.36938: getting variables 11000 1726867152.36939: in VariableManager get_vars() 11000 1726867152.36981: Calling all_inventory to load vars for managed_node1 11000 1726867152.36984: Calling groups_inventory to load vars for managed_node1 11000 1726867152.36986: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867152.36997: Calling all_plugins_play to load vars for managed_node1 11000 1726867152.37000: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867152.37002: Calling groups_plugins_play to load vars for managed_node1 11000 1726867152.40063: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867152.41710: done with get_vars() 11000 1726867152.41737: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11000 1726867152.41817: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:19:12 -0400 (0:00:00.206) 0:00:14.061 ****** 11000 1726867152.41853: entering _queue_task() for managed_node1/yum 11000 1726867152.41855: Creating lock for yum 11000 1726867152.42543: worker is 1 (out of 1 available) 11000 1726867152.42554: exiting _queue_task() for managed_node1/yum 11000 1726867152.42567: done queuing things up, now waiting for results queue to drain 11000 1726867152.42569: waiting for pending results... 
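
Editor's note: the teaming guard and the DNF check above were both skipped because the filter chain quoted in their false_condition found no connection of type "team" in network_connections. The same Jinja2 expression can be exercised on its own; the snippet below is an illustrative playbook (sample data invented here, not taken from this run) showing how selectattr("type", "defined") | selectattr("type", "match", "^team$") counts team profiles:

    - hosts: localhost
      gather_facts: false
      vars:
        network_connections:         # sample data for illustration only
          - name: bond0
            type: bond
            interface_name: nm-bond
          - name: team0
            type: team
      tasks:
        - name: Count team-typed connections the way the role's conditional does
          ansible.builtin.debug:
            msg: >-
              {{ network_connections
                 | selectattr("type", "defined")
                 | selectattr("type", "match", "^team$")
                 | list | length }}

With the sample data this prints 1; in the run logged here the count was 0 for both network_connections and network_state, hence the skips.
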
11000 1726867152.43224: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11000 1726867152.43262: in run() - task 0affcac9-a3a5-c734-026a-00000000002c 11000 1726867152.43278: variable 'ansible_search_path' from source: unknown 11000 1726867152.43282: variable 'ansible_search_path' from source: unknown 11000 1726867152.43648: calling self._execute() 11000 1726867152.43652: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867152.43655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867152.43658: variable 'omit' from source: magic vars 11000 1726867152.44152: variable 'ansible_distribution_major_version' from source: facts 11000 1726867152.44179: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867152.44362: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867152.47774: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867152.47839: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867152.47996: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867152.48027: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867152.48052: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867152.48240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867152.48267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867152.48298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.48582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867152.48586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867152.48734: variable 'ansible_distribution_major_version' from source: facts 11000 1726867152.48768: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11000 1726867152.48772: when evaluation is False, skipping this task 11000 1726867152.48788: _execute() done 11000 1726867152.48795: dumping result to json 11000 1726867152.48798: done dumping result, returning 11000 1726867152.48806: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-c734-026a-00000000002c] 11000 
1726867152.48809: sending task result for task 0affcac9-a3a5-c734-026a-00000000002c 11000 1726867152.48916: done sending task result for task 0affcac9-a3a5-c734-026a-00000000002c 11000 1726867152.48920: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11000 1726867152.48973: no more pending results, returning what we have 11000 1726867152.48978: results queue empty 11000 1726867152.48979: checking for any_errors_fatal 11000 1726867152.48984: done checking for any_errors_fatal 11000 1726867152.48985: checking for max_fail_percentage 11000 1726867152.48989: done checking for max_fail_percentage 11000 1726867152.48990: checking to see if all hosts have failed and the running result is not ok 11000 1726867152.48991: done checking to see if all hosts have failed 11000 1726867152.48991: getting the remaining hosts for this loop 11000 1726867152.48993: done getting the remaining hosts for this loop 11000 1726867152.48996: getting the next task for host managed_node1 11000 1726867152.49003: done getting next task for host managed_node1 11000 1726867152.49006: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11000 1726867152.49009: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867152.49024: getting variables 11000 1726867152.49026: in VariableManager get_vars() 11000 1726867152.49073: Calling all_inventory to load vars for managed_node1 11000 1726867152.49180: Calling groups_inventory to load vars for managed_node1 11000 1726867152.49185: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867152.49203: Calling all_plugins_play to load vars for managed_node1 11000 1726867152.49207: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867152.49210: Calling groups_plugins_play to load vars for managed_node1 11000 1726867152.50994: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867152.53058: done with get_vars() 11000 1726867152.53201: done getting variables 11000 1726867152.53257: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:19:12 -0400 (0:00:00.115) 0:00:14.177 ****** 11000 1726867152.53431: entering _queue_task() for managed_node1/fail 11000 1726867152.54113: worker is 1 (out of 1 available) 11000 1726867152.54127: exiting _queue_task() for managed_node1/fail 11000 1726867152.54159: done queuing things up, now waiting for results queue to drain 11000 1726867152.54160: waiting for pending results... 
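
Editor's note: the YUM variant above is skipped on this host because ansible_distribution_major_version | int < 8 is False, while the DNF counterpart earlier was gated on ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7; note also the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" entry, which shows the yum action being served by the dnf plugin on this platform. A hedged sketch of that mutually exclusive gating (package name and check_mode usage invented for illustration; the role's real task bodies are not in the log) looks like:

    - name: Check for updates with DNF on Fedora / EL 8+
      ansible.builtin.dnf:
        name: NetworkManager         # package chosen for illustration only
        state: latest
      check_mode: true
      when: ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7

    - name: Check for updates with YUM on EL 7 and older
      ansible.builtin.yum:
        name: NetworkManager         # illustration only
        state: latest
      check_mode: true
      when: ansible_distribution_major_version | int < 8
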
11000 1726867152.54522: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11000 1726867152.54530: in run() - task 0affcac9-a3a5-c734-026a-00000000002d 11000 1726867152.54550: variable 'ansible_search_path' from source: unknown 11000 1726867152.54557: variable 'ansible_search_path' from source: unknown 11000 1726867152.54599: calling self._execute() 11000 1726867152.54694: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867152.54705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867152.54718: variable 'omit' from source: magic vars 11000 1726867152.55104: variable 'ansible_distribution_major_version' from source: facts 11000 1726867152.55120: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867152.55249: variable '__network_wireless_connections_defined' from source: role '' defaults 11000 1726867152.55497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867152.57851: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867152.57931: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867152.58183: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867152.58234: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867152.58265: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867152.58490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867152.58494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867152.58497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.58674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867152.58707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867152.58759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867152.58923: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867152.58927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.58967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867152.59049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867152.59098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867152.59166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867152.59274: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.59326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867152.59373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867152.59761: variable 'network_connections' from source: task vars 11000 1726867152.59939: variable 'controller_profile' from source: play vars 11000 1726867152.59942: variable 'controller_profile' from source: play vars 11000 1726867152.59945: variable 'controller_device' from source: play vars 11000 1726867152.60110: variable 'controller_device' from source: play vars 11000 1726867152.60188: variable 'port1_profile' from source: play vars 11000 1726867152.60305: variable 'port1_profile' from source: play vars 11000 1726867152.60317: variable 'dhcp_interface1' from source: play vars 11000 1726867152.60389: variable 'dhcp_interface1' from source: play vars 11000 1726867152.60407: variable 'controller_profile' from source: play vars 11000 1726867152.60490: variable 'controller_profile' from source: play vars 11000 1726867152.60503: variable 'port2_profile' from source: play vars 11000 1726867152.60565: variable 'port2_profile' from source: play vars 11000 1726867152.60581: variable 'dhcp_interface2' from source: play vars 11000 1726867152.60648: variable 'dhcp_interface2' from source: play vars 11000 1726867152.60661: variable 'controller_profile' from source: play vars 11000 1726867152.60730: variable 'controller_profile' from source: play vars 11000 1726867152.60812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867152.61006: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867152.61054: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867152.61094: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867152.61128: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867152.61185: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867152.61247: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867152.61250: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.61279: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867152.61382: variable '__network_team_connections_defined' from source: role '' defaults 11000 1726867152.61611: variable 'network_connections' from source: task vars 11000 1726867152.61621: variable 'controller_profile' from source: play vars 11000 1726867152.61695: variable 'controller_profile' from source: play vars 11000 1726867152.61708: variable 'controller_device' from source: play vars 11000 1726867152.61796: variable 'controller_device' from source: play vars 11000 1726867152.61799: variable 'port1_profile' from source: play vars 11000 1726867152.61874: variable 'port1_profile' from source: play vars 11000 1726867152.61924: variable 'dhcp_interface1' from source: play vars 11000 1726867152.61983: variable 'dhcp_interface1' from source: play vars 11000 1726867152.61999: variable 'controller_profile' from source: play vars 11000 1726867152.62120: variable 'controller_profile' from source: play vars 11000 1726867152.62123: variable 'port2_profile' from source: play vars 11000 1726867152.62185: variable 'port2_profile' from source: play vars 11000 1726867152.62199: variable 'dhcp_interface2' from source: play vars 11000 1726867152.62337: variable 'dhcp_interface2' from source: play vars 11000 1726867152.62341: variable 'controller_profile' from source: play vars 11000 1726867152.62346: variable 'controller_profile' from source: play vars 11000 1726867152.62390: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11000 1726867152.62400: when evaluation is False, skipping this task 11000 1726867152.62407: _execute() done 11000 1726867152.62415: dumping result to json 11000 1726867152.62423: done dumping result, returning 11000 1726867152.62444: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-c734-026a-00000000002d] 11000 1726867152.62455: sending task result for task 0affcac9-a3a5-c734-026a-00000000002d skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11000 1726867152.62720: no more pending results, returning what we have 11000 1726867152.62723: results queue empty 11000 1726867152.62724: checking for any_errors_fatal 11000 1726867152.62729: done checking for any_errors_fatal 11000 1726867152.62730: checking for max_fail_percentage 11000 1726867152.62732: done checking for max_fail_percentage 11000 1726867152.62733: checking to see 
if all hosts have failed and the running result is not ok 11000 1726867152.62734: done checking to see if all hosts have failed 11000 1726867152.62734: getting the remaining hosts for this loop 11000 1726867152.62736: done getting the remaining hosts for this loop 11000 1726867152.62740: getting the next task for host managed_node1 11000 1726867152.62746: done getting next task for host managed_node1 11000 1726867152.62750: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11000 1726867152.62753: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867152.62766: getting variables 11000 1726867152.62768: in VariableManager get_vars() 11000 1726867152.63011: Calling all_inventory to load vars for managed_node1 11000 1726867152.63015: Calling groups_inventory to load vars for managed_node1 11000 1726867152.63017: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867152.63026: Calling all_plugins_play to load vars for managed_node1 11000 1726867152.63028: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867152.63031: Calling groups_plugins_play to load vars for managed_node1 11000 1726867152.63607: done sending task result for task 0affcac9-a3a5-c734-026a-00000000002d 11000 1726867152.63611: WORKER PROCESS EXITING 11000 1726867152.65284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867152.66620: done with get_vars() 11000 1726867152.66636: done getting variables 11000 1726867152.66680: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:19:12 -0400 (0:00:00.132) 0:00:14.310 ****** 11000 1726867152.66707: entering _queue_task() for managed_node1/package 11000 1726867152.66939: worker is 1 (out of 1 available) 11000 1726867152.66951: exiting _queue_task() for managed_node1/package 11000 1726867152.66963: done queuing things up, now waiting for results queue to drain 11000 1726867152.66964: waiting for pending results... 
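
Editor's note: the 'package' action plugin is loaded for the Install packages task (tasks/main.yml:73), and the entries that follow resolve network_packages and the __network_packages_default_* defaults for the NetworkManager provider. The role's real task is not reproduced in the log; as an illustration, an install step driven by such a variable typically looks like the sketch below (variable contents are a placeholder, not the role's actual defaults):

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present
      vars:
        network_packages:            # placeholder list; the real value is computed from the role defaults
          - NetworkManager
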
11000 1726867152.67162: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 11000 1726867152.67295: in run() - task 0affcac9-a3a5-c734-026a-00000000002e 11000 1726867152.67320: variable 'ansible_search_path' from source: unknown 11000 1726867152.67413: variable 'ansible_search_path' from source: unknown 11000 1726867152.67416: calling self._execute() 11000 1726867152.67455: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867152.67465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867152.67476: variable 'omit' from source: magic vars 11000 1726867152.67848: variable 'ansible_distribution_major_version' from source: facts 11000 1726867152.67868: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867152.68030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867152.68223: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867152.68255: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867152.68282: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867152.68316: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867152.68589: variable 'network_packages' from source: role '' defaults 11000 1726867152.68592: variable '__network_provider_setup' from source: role '' defaults 11000 1726867152.68595: variable '__network_service_name_default_nm' from source: role '' defaults 11000 1726867152.68618: variable '__network_service_name_default_nm' from source: role '' defaults 11000 1726867152.68633: variable '__network_packages_default_nm' from source: role '' defaults 11000 1726867152.68708: variable '__network_packages_default_nm' from source: role '' defaults 11000 1726867152.68899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867152.70606: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867152.70654: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867152.70684: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867152.70709: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867152.70729: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867152.70787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867152.70810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867152.70828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.70856: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867152.70866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867152.70905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867152.70921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867152.70939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.70964: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867152.70974: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867152.71123: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11000 1726867152.71197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867152.71215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867152.71236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.71260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867152.71271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867152.71337: variable 'ansible_python' from source: facts 11000 1726867152.71353: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11000 1726867152.71411: variable '__network_wpa_supplicant_required' from source: role '' defaults 11000 1726867152.71466: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11000 1726867152.71552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867152.71567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11000 1726867152.71587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.71614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867152.71624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867152.71660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867152.71679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867152.71699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.71725: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867152.71735: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867152.71834: variable 'network_connections' from source: task vars 11000 1726867152.71838: variable 'controller_profile' from source: play vars 11000 1726867152.71913: variable 'controller_profile' from source: play vars 11000 1726867152.71922: variable 'controller_device' from source: play vars 11000 1726867152.71989: variable 'controller_device' from source: play vars 11000 1726867152.72002: variable 'port1_profile' from source: play vars 11000 1726867152.72069: variable 'port1_profile' from source: play vars 11000 1726867152.72078: variable 'dhcp_interface1' from source: play vars 11000 1726867152.72150: variable 'dhcp_interface1' from source: play vars 11000 1726867152.72158: variable 'controller_profile' from source: play vars 11000 1726867152.72230: variable 'controller_profile' from source: play vars 11000 1726867152.72237: variable 'port2_profile' from source: play vars 11000 1726867152.72311: variable 'port2_profile' from source: play vars 11000 1726867152.72315: variable 'dhcp_interface2' from source: play vars 11000 1726867152.72385: variable 'dhcp_interface2' from source: play vars 11000 1726867152.72396: variable 'controller_profile' from source: play vars 11000 1726867152.72468: variable 'controller_profile' from source: play vars 11000 1726867152.72522: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867152.72546: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 
1726867152.72564: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867152.72588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867152.72626: variable '__network_wireless_connections_defined' from source: role '' defaults 11000 1726867152.72810: variable 'network_connections' from source: task vars 11000 1726867152.72813: variable 'controller_profile' from source: play vars 11000 1726867152.72887: variable 'controller_profile' from source: play vars 11000 1726867152.72899: variable 'controller_device' from source: play vars 11000 1726867152.72965: variable 'controller_device' from source: play vars 11000 1726867152.72972: variable 'port1_profile' from source: play vars 11000 1726867152.73043: variable 'port1_profile' from source: play vars 11000 1726867152.73051: variable 'dhcp_interface1' from source: play vars 11000 1726867152.73125: variable 'dhcp_interface1' from source: play vars 11000 1726867152.73132: variable 'controller_profile' from source: play vars 11000 1726867152.73204: variable 'controller_profile' from source: play vars 11000 1726867152.73211: variable 'port2_profile' from source: play vars 11000 1726867152.73280: variable 'port2_profile' from source: play vars 11000 1726867152.73292: variable 'dhcp_interface2' from source: play vars 11000 1726867152.73361: variable 'dhcp_interface2' from source: play vars 11000 1726867152.73368: variable 'controller_profile' from source: play vars 11000 1726867152.73442: variable 'controller_profile' from source: play vars 11000 1726867152.73481: variable '__network_packages_default_wireless' from source: role '' defaults 11000 1726867152.73540: variable '__network_wireless_connections_defined' from source: role '' defaults 11000 1726867152.73784: variable 'network_connections' from source: task vars 11000 1726867152.73791: variable 'controller_profile' from source: play vars 11000 1726867152.73836: variable 'controller_profile' from source: play vars 11000 1726867152.73842: variable 'controller_device' from source: play vars 11000 1726867152.73920: variable 'controller_device' from source: play vars 11000 1726867152.73923: variable 'port1_profile' from source: play vars 11000 1726867152.73955: variable 'port1_profile' from source: play vars 11000 1726867152.73961: variable 'dhcp_interface1' from source: play vars 11000 1726867152.74014: variable 'dhcp_interface1' from source: play vars 11000 1726867152.74019: variable 'controller_profile' from source: play vars 11000 1726867152.74064: variable 'controller_profile' from source: play vars 11000 1726867152.74071: variable 'port2_profile' from source: play vars 11000 1726867152.74120: variable 'port2_profile' from source: play vars 11000 1726867152.74126: variable 'dhcp_interface2' from source: play vars 11000 1726867152.74170: variable 'dhcp_interface2' from source: play vars 11000 1726867152.74175: variable 'controller_profile' from source: play vars 11000 1726867152.74226: variable 'controller_profile' from source: play vars 11000 1726867152.74245: variable '__network_packages_default_team' from source: role '' defaults 11000 1726867152.74305: variable '__network_team_connections_defined' from source: role '' defaults 11000 1726867152.74525: variable 'network_connections' from source: 
task vars 11000 1726867152.74531: variable 'controller_profile' from source: play vars 11000 1726867152.74576: variable 'controller_profile' from source: play vars 11000 1726867152.74584: variable 'controller_device' from source: play vars 11000 1726867152.74633: variable 'controller_device' from source: play vars 11000 1726867152.74642: variable 'port1_profile' from source: play vars 11000 1726867152.74686: variable 'port1_profile' from source: play vars 11000 1726867152.74694: variable 'dhcp_interface1' from source: play vars 11000 1726867152.74740: variable 'dhcp_interface1' from source: play vars 11000 1726867152.74746: variable 'controller_profile' from source: play vars 11000 1726867152.74794: variable 'controller_profile' from source: play vars 11000 1726867152.74800: variable 'port2_profile' from source: play vars 11000 1726867152.74847: variable 'port2_profile' from source: play vars 11000 1726867152.74853: variable 'dhcp_interface2' from source: play vars 11000 1726867152.74901: variable 'dhcp_interface2' from source: play vars 11000 1726867152.74908: variable 'controller_profile' from source: play vars 11000 1726867152.74953: variable 'controller_profile' from source: play vars 11000 1726867152.75261: variable '__network_service_name_default_initscripts' from source: role '' defaults 11000 1726867152.75265: variable '__network_service_name_default_initscripts' from source: role '' defaults 11000 1726867152.75267: variable '__network_packages_default_initscripts' from source: role '' defaults 11000 1726867152.75269: variable '__network_packages_default_initscripts' from source: role '' defaults 11000 1726867152.75374: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11000 1726867152.75864: variable 'network_connections' from source: task vars 11000 1726867152.75867: variable 'controller_profile' from source: play vars 11000 1726867152.76131: variable 'controller_profile' from source: play vars 11000 1726867152.76137: variable 'controller_device' from source: play vars 11000 1726867152.76198: variable 'controller_device' from source: play vars 11000 1726867152.76207: variable 'port1_profile' from source: play vars 11000 1726867152.76265: variable 'port1_profile' from source: play vars 11000 1726867152.76272: variable 'dhcp_interface1' from source: play vars 11000 1726867152.76541: variable 'dhcp_interface1' from source: play vars 11000 1726867152.76546: variable 'controller_profile' from source: play vars 11000 1726867152.76617: variable 'controller_profile' from source: play vars 11000 1726867152.76621: variable 'port2_profile' from source: play vars 11000 1726867152.76696: variable 'port2_profile' from source: play vars 11000 1726867152.76699: variable 'dhcp_interface2' from source: play vars 11000 1726867152.76959: variable 'dhcp_interface2' from source: play vars 11000 1726867152.76980: variable 'controller_profile' from source: play vars 11000 1726867152.77043: variable 'controller_profile' from source: play vars 11000 1726867152.77060: variable 'ansible_distribution' from source: facts 11000 1726867152.77081: variable '__network_rh_distros' from source: role '' defaults 11000 1726867152.77084: variable 'ansible_distribution_major_version' from source: facts 11000 1726867152.77111: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11000 1726867152.77228: variable 'ansible_distribution' from source: facts 11000 1726867152.77233: variable '__network_rh_distros' from source: role '' defaults 
11000 1726867152.77236: variable 'ansible_distribution_major_version' from source: facts 11000 1726867152.77238: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11000 1726867152.77358: variable 'ansible_distribution' from source: facts 11000 1726867152.77361: variable '__network_rh_distros' from source: role '' defaults 11000 1726867152.77364: variable 'ansible_distribution_major_version' from source: facts 11000 1726867152.77396: variable 'network_provider' from source: set_fact 11000 1726867152.77409: variable 'ansible_facts' from source: unknown 11000 1726867152.77779: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11000 1726867152.77782: when evaluation is False, skipping this task 11000 1726867152.77785: _execute() done 11000 1726867152.77787: dumping result to json 11000 1726867152.77792: done dumping result, returning 11000 1726867152.77800: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-c734-026a-00000000002e] 11000 1726867152.77806: sending task result for task 0affcac9-a3a5-c734-026a-00000000002e 11000 1726867152.77889: done sending task result for task 0affcac9-a3a5-c734-026a-00000000002e 11000 1726867152.77892: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11000 1726867152.77956: no more pending results, returning what we have 11000 1726867152.77959: results queue empty 11000 1726867152.77960: checking for any_errors_fatal 11000 1726867152.77965: done checking for any_errors_fatal 11000 1726867152.77966: checking for max_fail_percentage 11000 1726867152.77967: done checking for max_fail_percentage 11000 1726867152.77968: checking to see if all hosts have failed and the running result is not ok 11000 1726867152.77969: done checking to see if all hosts have failed 11000 1726867152.77969: getting the remaining hosts for this loop 11000 1726867152.77971: done getting the remaining hosts for this loop 11000 1726867152.77974: getting the next task for host managed_node1 11000 1726867152.77982: done getting next task for host managed_node1 11000 1726867152.77985: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11000 1726867152.77988: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867152.78001: getting variables 11000 1726867152.78002: in VariableManager get_vars() 11000 1726867152.78040: Calling all_inventory to load vars for managed_node1 11000 1726867152.78043: Calling groups_inventory to load vars for managed_node1 11000 1726867152.78045: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867152.78054: Calling all_plugins_play to load vars for managed_node1 11000 1726867152.78057: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867152.78059: Calling groups_plugins_play to load vars for managed_node1 11000 1726867152.78964: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867152.80590: done with get_vars() 11000 1726867152.80610: done getting variables 11000 1726867152.80673: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:19:12 -0400 (0:00:00.139) 0:00:14.450 ****** 11000 1726867152.80709: entering _queue_task() for managed_node1/package 11000 1726867152.80983: worker is 1 (out of 1 available) 11000 1726867152.80997: exiting _queue_task() for managed_node1/package 11000 1726867152.81009: done queuing things up, now waiting for results queue to drain 11000 1726867152.81010: waiting for pending results... 
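The skip recorded above comes from the role's guard on the package step: it only runs when at least one entry in network_packages is missing from the gathered package facts. A minimal sketch of that pattern, assuming network_packages is a flat list and that package facts were gathered earlier; only the when expression is taken from the log, the task body is illustrative:

- name: Install packages (sketch of the role's guard)
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  # Skipped whenever every requested package already appears in the
  # package facts, which is exactly the skip the log records above.
  when: not network_packages is subset(ansible_facts.packages.keys())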
11000 1726867152.81401: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11000 1726867152.81406: in run() - task 0affcac9-a3a5-c734-026a-00000000002f 11000 1726867152.81409: variable 'ansible_search_path' from source: unknown 11000 1726867152.81412: variable 'ansible_search_path' from source: unknown 11000 1726867152.81436: calling self._execute() 11000 1726867152.81529: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867152.81541: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867152.81554: variable 'omit' from source: magic vars 11000 1726867152.81932: variable 'ansible_distribution_major_version' from source: facts 11000 1726867152.81951: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867152.82088: variable 'network_state' from source: role '' defaults 11000 1726867152.82104: Evaluated conditional (network_state != {}): False 11000 1726867152.82111: when evaluation is False, skipping this task 11000 1726867152.82117: _execute() done 11000 1726867152.82144: dumping result to json 11000 1726867152.82147: done dumping result, returning 11000 1726867152.82150: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-c734-026a-00000000002f] 11000 1726867152.82153: sending task result for task 0affcac9-a3a5-c734-026a-00000000002f skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11000 1726867152.82450: no more pending results, returning what we have 11000 1726867152.82454: results queue empty 11000 1726867152.82455: checking for any_errors_fatal 11000 1726867152.82460: done checking for any_errors_fatal 11000 1726867152.82461: checking for max_fail_percentage 11000 1726867152.82462: done checking for max_fail_percentage 11000 1726867152.82463: checking to see if all hosts have failed and the running result is not ok 11000 1726867152.82464: done checking to see if all hosts have failed 11000 1726867152.82465: getting the remaining hosts for this loop 11000 1726867152.82466: done getting the remaining hosts for this loop 11000 1726867152.82470: getting the next task for host managed_node1 11000 1726867152.82478: done getting next task for host managed_node1 11000 1726867152.82482: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11000 1726867152.82488: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867152.82504: getting variables 11000 1726867152.82506: in VariableManager get_vars() 11000 1726867152.82544: Calling all_inventory to load vars for managed_node1 11000 1726867152.82546: Calling groups_inventory to load vars for managed_node1 11000 1726867152.82549: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867152.82560: Calling all_plugins_play to load vars for managed_node1 11000 1726867152.82563: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867152.82566: Calling groups_plugins_play to load vars for managed_node1 11000 1726867152.83488: done sending task result for task 0affcac9-a3a5-c734-026a-00000000002f 11000 1726867152.83492: WORKER PROCESS EXITING 11000 1726867152.84753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867152.94795: done with get_vars() 11000 1726867152.94942: done getting variables 11000 1726867152.95120: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:19:12 -0400 (0:00:00.145) 0:00:14.595 ****** 11000 1726867152.95279: entering _queue_task() for managed_node1/package 11000 1726867152.96160: worker is 1 (out of 1 available) 11000 1726867152.96173: exiting _queue_task() for managed_node1/package 11000 1726867152.96191: done queuing things up, now waiting for results queue to drain 11000 1726867152.96193: waiting for pending results... 
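The two nmstate-related install tasks are gated the same way: they only fire when a non-empty network_state desired-state document is passed to the role. A minimal sketch of that guard, with the package list inferred from the task name rather than taken from the role defaults:

- name: Install NetworkManager and nmstate when using network_state variable (sketch)
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  # network_state defaults to {} in the role; with no desired-state
  # document supplied the condition is False and the task is skipped.
  when: network_state != {}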
11000 1726867152.96703: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11000 1726867152.96935: in run() - task 0affcac9-a3a5-c734-026a-000000000030 11000 1726867152.97012: variable 'ansible_search_path' from source: unknown 11000 1726867152.97018: variable 'ansible_search_path' from source: unknown 11000 1726867152.97091: calling self._execute() 11000 1726867152.97352: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867152.97358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867152.97367: variable 'omit' from source: magic vars 11000 1726867152.98128: variable 'ansible_distribution_major_version' from source: facts 11000 1726867152.98140: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867152.98375: variable 'network_state' from source: role '' defaults 11000 1726867152.98497: Evaluated conditional (network_state != {}): False 11000 1726867152.98500: when evaluation is False, skipping this task 11000 1726867152.98503: _execute() done 11000 1726867152.98506: dumping result to json 11000 1726867152.98508: done dumping result, returning 11000 1726867152.98516: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-c734-026a-000000000030] 11000 1726867152.98521: sending task result for task 0affcac9-a3a5-c734-026a-000000000030 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11000 1726867152.98741: no more pending results, returning what we have 11000 1726867152.98744: results queue empty 11000 1726867152.98744: checking for any_errors_fatal 11000 1726867152.98751: done checking for any_errors_fatal 11000 1726867152.98751: checking for max_fail_percentage 11000 1726867152.98753: done checking for max_fail_percentage 11000 1726867152.98754: checking to see if all hosts have failed and the running result is not ok 11000 1726867152.98755: done checking to see if all hosts have failed 11000 1726867152.98755: getting the remaining hosts for this loop 11000 1726867152.98757: done getting the remaining hosts for this loop 11000 1726867152.98760: getting the next task for host managed_node1 11000 1726867152.98766: done getting next task for host managed_node1 11000 1726867152.98770: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11000 1726867152.98773: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867152.98793: getting variables 11000 1726867152.98795: in VariableManager get_vars() 11000 1726867152.98945: Calling all_inventory to load vars for managed_node1 11000 1726867152.98948: Calling groups_inventory to load vars for managed_node1 11000 1726867152.98950: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867152.98961: Calling all_plugins_play to load vars for managed_node1 11000 1726867152.98963: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867152.98966: Calling groups_plugins_play to load vars for managed_node1 11000 1726867152.99537: done sending task result for task 0affcac9-a3a5-c734-026a-000000000030 11000 1726867152.99542: WORKER PROCESS EXITING 11000 1726867153.01965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867153.04330: done with get_vars() 11000 1726867153.04352: done getting variables 11000 1726867153.04489: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:19:13 -0400 (0:00:00.092) 0:00:14.688 ****** 11000 1726867153.04530: entering _queue_task() for managed_node1/service 11000 1726867153.04532: Creating lock for service 11000 1726867153.05005: worker is 1 (out of 1 available) 11000 1726867153.05014: exiting _queue_task() for managed_node1/service 11000 1726867153.05025: done queuing things up, now waiting for results queue to drain 11000 1726867153.05026: waiting for pending results... 
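The restart above is only needed when the requested profiles include wireless or team interfaces; with the bond-over-ethernet profiles in play here both flags resolve to false, so the task is skipped. A minimal sketch of the guard, assuming a plain service restart as the body (the real role derives the two flags from network_connections):

- name: Restart NetworkManager due to wireless or team interfaces (sketch)
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  # Both flags come from inspecting network_connections; neither a
  # wireless nor a team connection is defined in this run.
  when: __network_wireless_connections_defined or __network_team_connections_defined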
11000 1726867153.05227: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11000 1726867153.05438: in run() - task 0affcac9-a3a5-c734-026a-000000000031 11000 1726867153.05501: variable 'ansible_search_path' from source: unknown 11000 1726867153.05505: variable 'ansible_search_path' from source: unknown 11000 1726867153.05510: calling self._execute() 11000 1726867153.05583: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867153.05609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867153.05622: variable 'omit' from source: magic vars 11000 1726867153.06202: variable 'ansible_distribution_major_version' from source: facts 11000 1726867153.06345: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867153.06702: variable '__network_wireless_connections_defined' from source: role '' defaults 11000 1726867153.06717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867153.10490: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867153.10564: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867153.10601: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867153.10642: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867153.10668: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867153.10750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867153.10779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867153.10804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867153.10876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867153.10917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867153.11018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867153.11070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867153.11197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11000 1726867153.11243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867153.11254: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867153.11535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867153.11584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867153.11823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867153.11883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867153.11889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867153.12353: variable 'network_connections' from source: task vars 11000 1726867153.12374: variable 'controller_profile' from source: play vars 11000 1726867153.12562: variable 'controller_profile' from source: play vars 11000 1726867153.12573: variable 'controller_device' from source: play vars 11000 1726867153.12751: variable 'controller_device' from source: play vars 11000 1726867153.12755: variable 'port1_profile' from source: play vars 11000 1726867153.12930: variable 'port1_profile' from source: play vars 11000 1726867153.13006: variable 'dhcp_interface1' from source: play vars 11000 1726867153.13058: variable 'dhcp_interface1' from source: play vars 11000 1726867153.13068: variable 'controller_profile' from source: play vars 11000 1726867153.13131: variable 'controller_profile' from source: play vars 11000 1726867153.13144: variable 'port2_profile' from source: play vars 11000 1726867153.13404: variable 'port2_profile' from source: play vars 11000 1726867153.13408: variable 'dhcp_interface2' from source: play vars 11000 1726867153.13420: variable 'dhcp_interface2' from source: play vars 11000 1726867153.13473: variable 'controller_profile' from source: play vars 11000 1726867153.13703: variable 'controller_profile' from source: play vars 11000 1726867153.13707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867153.14374: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867153.14426: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867153.14503: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867153.14538: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867153.14629: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867153.14634: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867153.14660: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867153.14687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867153.14834: variable '__network_team_connections_defined' from source: role '' defaults 11000 1726867153.15116: variable 'network_connections' from source: task vars 11000 1726867153.15120: variable 'controller_profile' from source: play vars 11000 1726867153.15246: variable 'controller_profile' from source: play vars 11000 1726867153.15249: variable 'controller_device' from source: play vars 11000 1726867153.15287: variable 'controller_device' from source: play vars 11000 1726867153.15305: variable 'port1_profile' from source: play vars 11000 1726867153.15388: variable 'port1_profile' from source: play vars 11000 1726867153.15398: variable 'dhcp_interface1' from source: play vars 11000 1726867153.15463: variable 'dhcp_interface1' from source: play vars 11000 1726867153.15467: variable 'controller_profile' from source: play vars 11000 1726867153.15537: variable 'controller_profile' from source: play vars 11000 1726867153.15572: variable 'port2_profile' from source: play vars 11000 1726867153.15608: variable 'port2_profile' from source: play vars 11000 1726867153.15615: variable 'dhcp_interface2' from source: play vars 11000 1726867153.15682: variable 'dhcp_interface2' from source: play vars 11000 1726867153.15686: variable 'controller_profile' from source: play vars 11000 1726867153.15791: variable 'controller_profile' from source: play vars 11000 1726867153.15794: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11000 1726867153.15797: when evaluation is False, skipping this task 11000 1726867153.15799: _execute() done 11000 1726867153.15801: dumping result to json 11000 1726867153.15803: done dumping result, returning 11000 1726867153.15811: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-c734-026a-000000000031] 11000 1726867153.15824: sending task result for task 0affcac9-a3a5-c734-026a-000000000031 11000 1726867153.16082: done sending task result for task 0affcac9-a3a5-c734-026a-000000000031 11000 1726867153.16085: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11000 1726867153.16131: no more pending results, returning what we have 11000 1726867153.16134: results queue empty 11000 1726867153.16135: checking for any_errors_fatal 11000 1726867153.16151: done checking for any_errors_fatal 11000 1726867153.16152: checking for max_fail_percentage 11000 1726867153.16154: done checking for max_fail_percentage 11000 
1726867153.16155: checking to see if all hosts have failed and the running result is not ok 11000 1726867153.16156: done checking to see if all hosts have failed 11000 1726867153.16156: getting the remaining hosts for this loop 11000 1726867153.16158: done getting the remaining hosts for this loop 11000 1726867153.16162: getting the next task for host managed_node1 11000 1726867153.16168: done getting next task for host managed_node1 11000 1726867153.16172: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11000 1726867153.16175: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867153.16193: getting variables 11000 1726867153.16194: in VariableManager get_vars() 11000 1726867153.16235: Calling all_inventory to load vars for managed_node1 11000 1726867153.16238: Calling groups_inventory to load vars for managed_node1 11000 1726867153.16240: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867153.16381: Calling all_plugins_play to load vars for managed_node1 11000 1726867153.16385: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867153.16391: Calling groups_plugins_play to load vars for managed_node1 11000 1726867153.18959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867153.20066: done with get_vars() 11000 1726867153.20083: done getting variables 11000 1726867153.20127: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:19:13 -0400 (0:00:00.156) 0:00:14.844 ****** 11000 1726867153.20149: entering _queue_task() for managed_node1/service 11000 1726867153.20394: worker is 1 (out of 1 available) 11000 1726867153.20406: exiting _queue_task() for managed_node1/service 11000 1726867153.20419: done queuing things up, now waiting for results queue to drain 11000 1726867153.20420: waiting for pending results... 
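For the task queued below, the guard finally evaluates True, because network_provider was set to nm by an earlier set_fact, so the role goes on to manage the service. A minimal sketch of what such a task looks like, using NetworkManager in place of the role's network_service_name default:

- name: Enable and start NetworkManager (sketch)
  ansible.builtin.service:
    name: NetworkManager   # the role resolves this from network_service_name
    state: started
    enabled: true
  # True here: network_provider == "nm" was established by set_fact earlier.
  when: network_provider == "nm" or network_state != {}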
11000 1726867153.20600: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11000 1726867153.20794: in run() - task 0affcac9-a3a5-c734-026a-000000000032 11000 1726867153.20797: variable 'ansible_search_path' from source: unknown 11000 1726867153.20800: variable 'ansible_search_path' from source: unknown 11000 1726867153.20802: calling self._execute() 11000 1726867153.20839: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867153.20843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867153.20853: variable 'omit' from source: magic vars 11000 1726867153.21198: variable 'ansible_distribution_major_version' from source: facts 11000 1726867153.21202: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867153.21373: variable 'network_provider' from source: set_fact 11000 1726867153.21379: variable 'network_state' from source: role '' defaults 11000 1726867153.21382: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11000 1726867153.21385: variable 'omit' from source: magic vars 11000 1726867153.21482: variable 'omit' from source: magic vars 11000 1726867153.21486: variable 'network_service_name' from source: role '' defaults 11000 1726867153.21659: variable 'network_service_name' from source: role '' defaults 11000 1726867153.21663: variable '__network_provider_setup' from source: role '' defaults 11000 1726867153.21666: variable '__network_service_name_default_nm' from source: role '' defaults 11000 1726867153.21727: variable '__network_service_name_default_nm' from source: role '' defaults 11000 1726867153.21730: variable '__network_packages_default_nm' from source: role '' defaults 11000 1726867153.21789: variable '__network_packages_default_nm' from source: role '' defaults 11000 1726867153.22007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867153.23811: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867153.23861: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867153.23889: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867153.23922: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867153.23941: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867153.24000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867153.24024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867153.24044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867153.24070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 11000 1726867153.24082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867153.24116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867153.24136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867153.24154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867153.24180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867153.24193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867153.24333: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11000 1726867153.24412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867153.24429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867153.24446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867153.24476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867153.24488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867153.24547: variable 'ansible_python' from source: facts 11000 1726867153.24564: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11000 1726867153.24623: variable '__network_wpa_supplicant_required' from source: role '' defaults 11000 1726867153.24672: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11000 1726867153.24755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867153.24772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867153.24796: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867153.24820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867153.24832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867153.24863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867153.24884: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867153.24907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867153.24931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867153.24941: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867153.25035: variable 'network_connections' from source: task vars 11000 1726867153.25041: variable 'controller_profile' from source: play vars 11000 1726867153.25096: variable 'controller_profile' from source: play vars 11000 1726867153.25107: variable 'controller_device' from source: play vars 11000 1726867153.25158: variable 'controller_device' from source: play vars 11000 1726867153.25168: variable 'port1_profile' from source: play vars 11000 1726867153.25222: variable 'port1_profile' from source: play vars 11000 1726867153.25234: variable 'dhcp_interface1' from source: play vars 11000 1726867153.25282: variable 'dhcp_interface1' from source: play vars 11000 1726867153.25293: variable 'controller_profile' from source: play vars 11000 1726867153.25345: variable 'controller_profile' from source: play vars 11000 1726867153.25354: variable 'port2_profile' from source: play vars 11000 1726867153.25406: variable 'port2_profile' from source: play vars 11000 1726867153.25416: variable 'dhcp_interface2' from source: play vars 11000 1726867153.25469: variable 'dhcp_interface2' from source: play vars 11000 1726867153.25479: variable 'controller_profile' from source: play vars 11000 1726867153.25530: variable 'controller_profile' from source: play vars 11000 1726867153.25606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867153.25736: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867153.25788: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867153.25846: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 
1726867153.25942: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867153.25945: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867153.25950: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867153.26183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867153.26186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867153.26189: variable '__network_wireless_connections_defined' from source: role '' defaults 11000 1726867153.26336: variable 'network_connections' from source: task vars 11000 1726867153.26341: variable 'controller_profile' from source: play vars 11000 1726867153.26415: variable 'controller_profile' from source: play vars 11000 1726867153.26425: variable 'controller_device' from source: play vars 11000 1726867153.26496: variable 'controller_device' from source: play vars 11000 1726867153.26509: variable 'port1_profile' from source: play vars 11000 1726867153.26576: variable 'port1_profile' from source: play vars 11000 1726867153.26592: variable 'dhcp_interface1' from source: play vars 11000 1726867153.26661: variable 'dhcp_interface1' from source: play vars 11000 1726867153.26671: variable 'controller_profile' from source: play vars 11000 1726867153.26748: variable 'controller_profile' from source: play vars 11000 1726867153.26752: variable 'port2_profile' from source: play vars 11000 1726867153.26822: variable 'port2_profile' from source: play vars 11000 1726867153.26834: variable 'dhcp_interface2' from source: play vars 11000 1726867153.26896: variable 'dhcp_interface2' from source: play vars 11000 1726867153.26905: variable 'controller_profile' from source: play vars 11000 1726867153.26956: variable 'controller_profile' from source: play vars 11000 1726867153.26992: variable '__network_packages_default_wireless' from source: role '' defaults 11000 1726867153.27044: variable '__network_wireless_connections_defined' from source: role '' defaults 11000 1726867153.27235: variable 'network_connections' from source: task vars 11000 1726867153.27238: variable 'controller_profile' from source: play vars 11000 1726867153.27294: variable 'controller_profile' from source: play vars 11000 1726867153.27299: variable 'controller_device' from source: play vars 11000 1726867153.27344: variable 'controller_device' from source: play vars 11000 1726867153.27351: variable 'port1_profile' from source: play vars 11000 1726867153.27402: variable 'port1_profile' from source: play vars 11000 1726867153.27408: variable 'dhcp_interface1' from source: play vars 11000 1726867153.27457: variable 'dhcp_interface1' from source: play vars 11000 1726867153.27461: variable 'controller_profile' from source: play vars 11000 1726867153.27514: variable 'controller_profile' from source: play vars 11000 1726867153.27521: variable 'port2_profile' from source: play vars 11000 1726867153.27567: variable 'port2_profile' from source: play vars 11000 
1726867153.27573: variable 'dhcp_interface2' from source: play vars 11000 1726867153.27626: variable 'dhcp_interface2' from source: play vars 11000 1726867153.27630: variable 'controller_profile' from source: play vars 11000 1726867153.27680: variable 'controller_profile' from source: play vars 11000 1726867153.27700: variable '__network_packages_default_team' from source: role '' defaults 11000 1726867153.27755: variable '__network_team_connections_defined' from source: role '' defaults 11000 1726867153.27947: variable 'network_connections' from source: task vars 11000 1726867153.27951: variable 'controller_profile' from source: play vars 11000 1726867153.28002: variable 'controller_profile' from source: play vars 11000 1726867153.28008: variable 'controller_device' from source: play vars 11000 1726867153.28057: variable 'controller_device' from source: play vars 11000 1726867153.28065: variable 'port1_profile' from source: play vars 11000 1726867153.28115: variable 'port1_profile' from source: play vars 11000 1726867153.28120: variable 'dhcp_interface1' from source: play vars 11000 1726867153.28170: variable 'dhcp_interface1' from source: play vars 11000 1726867153.28176: variable 'controller_profile' from source: play vars 11000 1726867153.28225: variable 'controller_profile' from source: play vars 11000 1726867153.28231: variable 'port2_profile' from source: play vars 11000 1726867153.28282: variable 'port2_profile' from source: play vars 11000 1726867153.28293: variable 'dhcp_interface2' from source: play vars 11000 1726867153.28336: variable 'dhcp_interface2' from source: play vars 11000 1726867153.28341: variable 'controller_profile' from source: play vars 11000 1726867153.28392: variable 'controller_profile' from source: play vars 11000 1726867153.28436: variable '__network_service_name_default_initscripts' from source: role '' defaults 11000 1726867153.28479: variable '__network_service_name_default_initscripts' from source: role '' defaults 11000 1726867153.28492: variable '__network_packages_default_initscripts' from source: role '' defaults 11000 1726867153.28529: variable '__network_packages_default_initscripts' from source: role '' defaults 11000 1726867153.28667: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11000 1726867153.28975: variable 'network_connections' from source: task vars 11000 1726867153.28980: variable 'controller_profile' from source: play vars 11000 1726867153.29022: variable 'controller_profile' from source: play vars 11000 1726867153.29027: variable 'controller_device' from source: play vars 11000 1726867153.29071: variable 'controller_device' from source: play vars 11000 1726867153.29080: variable 'port1_profile' from source: play vars 11000 1726867153.29120: variable 'port1_profile' from source: play vars 11000 1726867153.29126: variable 'dhcp_interface1' from source: play vars 11000 1726867153.29169: variable 'dhcp_interface1' from source: play vars 11000 1726867153.29175: variable 'controller_profile' from source: play vars 11000 1726867153.29217: variable 'controller_profile' from source: play vars 11000 1726867153.29223: variable 'port2_profile' from source: play vars 11000 1726867153.29267: variable 'port2_profile' from source: play vars 11000 1726867153.29273: variable 'dhcp_interface2' from source: play vars 11000 1726867153.29316: variable 'dhcp_interface2' from source: play vars 11000 1726867153.29321: variable 'controller_profile' from source: play vars 11000 1726867153.29363: variable 
'controller_profile' from source: play vars 11000 1726867153.29366: variable 'ansible_distribution' from source: facts 11000 1726867153.29371: variable '__network_rh_distros' from source: role '' defaults 11000 1726867153.29379: variable 'ansible_distribution_major_version' from source: facts 11000 1726867153.29397: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11000 1726867153.29508: variable 'ansible_distribution' from source: facts 11000 1726867153.29511: variable '__network_rh_distros' from source: role '' defaults 11000 1726867153.29516: variable 'ansible_distribution_major_version' from source: facts 11000 1726867153.29527: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11000 1726867153.29638: variable 'ansible_distribution' from source: facts 11000 1726867153.29641: variable '__network_rh_distros' from source: role '' defaults 11000 1726867153.29645: variable 'ansible_distribution_major_version' from source: facts 11000 1726867153.29671: variable 'network_provider' from source: set_fact 11000 1726867153.29693: variable 'omit' from source: magic vars 11000 1726867153.29713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867153.29732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867153.29747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867153.29760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867153.29767: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867153.29794: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867153.29798: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867153.29801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867153.29864: Set connection var ansible_shell_type to sh 11000 1726867153.29870: Set connection var ansible_pipelining to False 11000 1726867153.29879: Set connection var ansible_shell_executable to /bin/sh 11000 1726867153.29882: Set connection var ansible_connection to ssh 11000 1726867153.29889: Set connection var ansible_timeout to 10 11000 1726867153.29892: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867153.29918: variable 'ansible_shell_executable' from source: unknown 11000 1726867153.29921: variable 'ansible_connection' from source: unknown 11000 1726867153.29924: variable 'ansible_module_compression' from source: unknown 11000 1726867153.29926: variable 'ansible_shell_type' from source: unknown 11000 1726867153.29928: variable 'ansible_shell_executable' from source: unknown 11000 1726867153.29930: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867153.29932: variable 'ansible_pipelining' from source: unknown 11000 1726867153.29934: variable 'ansible_timeout' from source: unknown 11000 1726867153.29936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867153.30009: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867153.30016: variable 'omit' from source: magic vars 11000 1726867153.30025: starting attempt loop 11000 1726867153.30028: running the handler 11000 1726867153.30080: variable 'ansible_facts' from source: unknown 11000 1726867153.30537: _low_level_execute_command(): starting 11000 1726867153.30542: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867153.31037: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867153.31041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867153.31044: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867153.31088: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867153.31092: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867153.31108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867153.31164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867153.32837: stdout chunk (state=3): >>>/root <<< 11000 1726867153.32928: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867153.32953: stderr chunk (state=3): >>><<< 11000 1726867153.32957: stdout chunk (state=3): >>><<< 11000 1726867153.32980: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
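At this point the connection to managed_node1 is fully parameterised: the ssh connection plugin and sh shell plugin are loaded from cache, the connection variables logged above (ansible_shell_type=sh, ansible_pipelining=False, ansible_shell_executable=/bin/sh, ansible_connection=ssh, ansible_timeout=10, ansible_module_compression=ZIP_DEFLATED) are applied, and the `echo ~ && sleep 0` probe discovers the remote user's home directory (/root) so a temporary directory can be created under it. A sketch of how equivalent connection settings could be expressed as inventory host vars; the values are illustrative, and only ansible_host=10.31.12.57 is actually visible in the SSH debug output:

  all:
    hosts:
      managed_node1:
        ansible_host: 10.31.12.57        # address the ssh plugin connects to in this log
        ansible_connection: ssh
        ansible_shell_type: sh
        ansible_shell_executable: /bin/sh
        ansible_pipelining: false        # matches "Set connection var ansible_pipelining to False" above
        ansible_timeout: 10
        ansible_ssh_extra_args: ""       # placeholder; the real value is not shown in the log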
debug2: Received exit status from master 0 11000 1726867153.32990: _low_level_execute_command(): starting 11000 1726867153.32994: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867153.3297613-11684-212239502187017 `" && echo ansible-tmp-1726867153.3297613-11684-212239502187017="` echo /root/.ansible/tmp/ansible-tmp-1726867153.3297613-11684-212239502187017 `" ) && sleep 0' 11000 1726867153.33414: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867153.33417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867153.33420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867153.33422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867153.33466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867153.33473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867153.33475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867153.33523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867153.35395: stdout chunk (state=3): >>>ansible-tmp-1726867153.3297613-11684-212239502187017=/root/.ansible/tmp/ansible-tmp-1726867153.3297613-11684-212239502187017 <<< 11000 1726867153.35499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867153.35527: stderr chunk (state=3): >>><<< 11000 1726867153.35530: stdout chunk (state=3): >>><<< 11000 1726867153.35545: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867153.3297613-11684-212239502187017=/root/.ansible/tmp/ansible-tmp-1726867153.3297613-11684-212239502187017 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867153.35575: variable 'ansible_module_compression' from source: unknown 11000 1726867153.35621: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 11000 1726867153.35625: ANSIBALLZ: Acquiring lock 11000 1726867153.35628: ANSIBALLZ: Lock acquired: 139984830862384 11000 1726867153.35630: ANSIBALLZ: Creating module 11000 1726867153.55191: ANSIBALLZ: Writing module into payload 11000 1726867153.55294: ANSIBALLZ: Writing module 11000 1726867153.55315: ANSIBALLZ: Renaming module 11000 1726867153.55320: ANSIBALLZ: Done creating module 11000 1726867153.55349: variable 'ansible_facts' from source: unknown 11000 1726867153.55479: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867153.3297613-11684-212239502187017/AnsiballZ_systemd.py 11000 1726867153.55581: Sending initial data 11000 1726867153.55584: Sent initial data (156 bytes) 11000 1726867153.56012: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867153.56016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867153.56019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867153.56063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867153.56066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867153.56128: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867153.57780: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867153.57827: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
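The ANSIBALLZ lines above package the systemd module and its module_utils into a single self-contained payload (AnsiballZ_systemd.py) on the controller, then transfer it into the freshly created ansible-tmp-1726867153.3297613-… directory on the target over the multiplexed SSH/SFTP connection. This copy step happens because ansible_pipelining is False in this run; as an aside (an option, not something this run uses), pipelining streams the payload over the remote Python's stdin and skips the temp-file transfer:

  # group_vars/all.yml (illustrative only)
  ansible_pipelining: true   # stream AnsiballZ payloads over stdin instead of copying them to a remote temp dir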
<<< 11000 1726867153.57870: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpamqa6uk5 /root/.ansible/tmp/ansible-tmp-1726867153.3297613-11684-212239502187017/AnsiballZ_systemd.py <<< 11000 1726867153.57873: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867153.3297613-11684-212239502187017/AnsiballZ_systemd.py" <<< 11000 1726867153.57946: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpamqa6uk5" to remote "/root/.ansible/tmp/ansible-tmp-1726867153.3297613-11684-212239502187017/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867153.3297613-11684-212239502187017/AnsiballZ_systemd.py" <<< 11000 1726867153.59540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867153.59570: stderr chunk (state=3): >>><<< 11000 1726867153.59614: stdout chunk (state=3): >>><<< 11000 1726867153.59624: done transferring module to remote 11000 1726867153.59647: _low_level_execute_command(): starting 11000 1726867153.59730: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867153.3297613-11684-212239502187017/ /root/.ansible/tmp/ansible-tmp-1726867153.3297613-11684-212239502187017/AnsiballZ_systemd.py && sleep 0' 11000 1726867153.60330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867153.60344: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867153.60358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867153.60374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867153.60394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867153.60405: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867153.60436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867153.60498: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867153.60559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867153.60579: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867153.60600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867153.60672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867153.62491: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867153.62508: stdout chunk (state=3): >>><<< 11000 1726867153.62530: stderr chunk (state=3): >>><<< 11000 1726867153.62625: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867153.62629: _low_level_execute_command(): starting 11000 1726867153.62631: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867153.3297613-11684-212239502187017/AnsiballZ_systemd.py && sleep 0' 11000 1726867153.63182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867153.63196: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867153.63241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867153.63255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11000 1726867153.63351: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867153.63372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867153.63392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867153.63473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867153.92814: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": 
"0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": "14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10022912", "MemoryPeak": "10547200", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3300282368", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "380497000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 11000 
1726867153.92859: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", 
"KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target NetworkManager-wait-online.service network.target", "After": "dbus-broker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", "AssertTimestampMonotonic": "14925366", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11000 1726867153.94884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 11000 1726867153.94891: stderr chunk (state=3): >>><<< 11000 1726867153.94894: stdout chunk (state=3): >>><<< 11000 1726867153.95402: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": "14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10022912", "MemoryPeak": "10547200", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3300282368", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "380497000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target NetworkManager-wait-online.service network.target", "After": "dbus-broker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", "AssertTimestampMonotonic": "14925366", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 11000 1726867153.95601: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867153.3297613-11684-212239502187017/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867153.95640: _low_level_execute_command(): starting 11000 1726867153.95790: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867153.3297613-11684-212239502187017/ > /dev/null 2>&1 && sleep 0' 11000 1726867153.96783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867153.96852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867153.97028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867153.97048: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867153.97137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 11000 1726867153.99030: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867153.99043: stdout chunk (state=3): >>><<< 11000 1726867153.99058: stderr chunk (state=3): >>><<< 11000 1726867153.99318: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867153.99322: handler run complete 11000 1726867153.99324: attempt loop complete, returning result 11000 1726867153.99326: _execute() done 11000 1726867153.99328: dumping result to json 11000 1726867153.99330: done dumping result, returning 11000 1726867153.99331: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-c734-026a-000000000032] 11000 1726867153.99336: sending task result for task 0affcac9-a3a5-c734-026a-000000000032 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11000 1726867154.01139: no more pending results, returning what we have 11000 1726867154.01142: results queue empty 11000 1726867154.01143: checking for any_errors_fatal 11000 1726867154.01149: done checking for any_errors_fatal 11000 1726867154.01149: checking for max_fail_percentage 11000 1726867154.01151: done checking for max_fail_percentage 11000 1726867154.01152: checking to see if all hosts have failed and the running result is not ok 11000 1726867154.01153: done checking to see if all hosts have failed 11000 1726867154.01154: getting the remaining hosts for this loop 11000 1726867154.01156: done getting the remaining hosts for this loop 11000 1726867154.01159: getting the next task for host managed_node1 11000 1726867154.01165: done getting next task for host managed_node1 11000 1726867154.01169: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11000 1726867154.01172: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867154.01185: getting variables 11000 1726867154.01308: in VariableManager get_vars() 11000 1726867154.01345: Calling all_inventory to load vars for managed_node1 11000 1726867154.01348: Calling groups_inventory to load vars for managed_node1 11000 1726867154.01350: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867154.01357: done sending task result for task 0affcac9-a3a5-c734-026a-000000000032 11000 1726867154.01360: WORKER PROCESS EXITING 11000 1726867154.01368: Calling all_plugins_play to load vars for managed_node1 11000 1726867154.01371: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867154.01374: Calling groups_plugins_play to load vars for managed_node1 11000 1726867154.03398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867154.05808: done with get_vars() 11000 1726867154.05838: done getting variables 11000 1726867154.05914: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:19:14 -0400 (0:00:00.858) 0:00:15.702 ****** 11000 1726867154.05956: entering _queue_task() for managed_node1/service 11000 1726867154.06326: worker is 1 (out of 1 available) 11000 1726867154.06336: exiting _queue_task() for managed_node1/service 11000 1726867154.06348: done queuing things up, now waiting for results queue to drain 11000 1726867154.06349: waiting for pending results... 
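Two things are worth noting in the hand-off above. First, the NetworkManager task's result body is printed as "censored" only because the role runs this task with no_log enabled (the _execute_module arguments include '_ansible_no_log': True), so just the ok/changed status survives into the displayed result. Second, the run queues the next role task, "Enable and start wpa_supplicant", whose fate is decided entirely by conditionals evaluated in the lines that follow: the distribution check and network_provider == "nm" pass, but __network_wpa_supplicant_required evaluates to False, so the task is skipped. A rough sketch of a service task gated that way, illustrative only and not the role's actual tasks/main.yml:

  - name: Enable and start wpa_supplicant
    ansible.builtin.systemd:
      name: wpa_supplicant
      state: started
      enabled: true
    when:
      - network_provider == "nm"
      - __network_wpa_supplicant_required | bool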
11000 1726867154.06629: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11000 1726867154.06767: in run() - task 0affcac9-a3a5-c734-026a-000000000033 11000 1726867154.06796: variable 'ansible_search_path' from source: unknown 11000 1726867154.06806: variable 'ansible_search_path' from source: unknown 11000 1726867154.06845: calling self._execute() 11000 1726867154.06984: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867154.06990: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867154.06993: variable 'omit' from source: magic vars 11000 1726867154.07796: variable 'ansible_distribution_major_version' from source: facts 11000 1726867154.07800: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867154.07803: variable 'network_provider' from source: set_fact 11000 1726867154.07806: Evaluated conditional (network_provider == "nm"): True 11000 1726867154.07958: variable '__network_wpa_supplicant_required' from source: role '' defaults 11000 1726867154.08183: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11000 1726867154.08547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867154.11062: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867154.11123: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867154.11150: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867154.11175: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867154.11202: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867154.11258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867154.11279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867154.11303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867154.11329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867154.11339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867154.11372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867154.11396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11000 1726867154.11415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867154.11440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867154.11451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867154.11480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867154.11499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867154.11518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867154.11543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867154.11553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867154.11649: variable 'network_connections' from source: task vars 11000 1726867154.11659: variable 'controller_profile' from source: play vars 11000 1726867154.11710: variable 'controller_profile' from source: play vars 11000 1726867154.11718: variable 'controller_device' from source: play vars 11000 1726867154.11762: variable 'controller_device' from source: play vars 11000 1726867154.11770: variable 'port1_profile' from source: play vars 11000 1726867154.11826: variable 'port1_profile' from source: play vars 11000 1726867154.11838: variable 'dhcp_interface1' from source: play vars 11000 1726867154.11876: variable 'dhcp_interface1' from source: play vars 11000 1726867154.11882: variable 'controller_profile' from source: play vars 11000 1726867154.11925: variable 'controller_profile' from source: play vars 11000 1726867154.11931: variable 'port2_profile' from source: play vars 11000 1726867154.11976: variable 'port2_profile' from source: play vars 11000 1726867154.11983: variable 'dhcp_interface2' from source: play vars 11000 1726867154.12026: variable 'dhcp_interface2' from source: play vars 11000 1726867154.12033: variable 'controller_profile' from source: play vars 11000 1726867154.12080: variable 'controller_profile' from source: play vars 11000 1726867154.12129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867154.12252: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867154.12308: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867154.12316: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867154.12348: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867154.12400: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867154.12425: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867154.12445: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867154.12686: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867154.12689: variable '__network_wireless_connections_defined' from source: role '' defaults 11000 1726867154.12755: variable 'network_connections' from source: task vars 11000 1726867154.12760: variable 'controller_profile' from source: play vars 11000 1726867154.12837: variable 'controller_profile' from source: play vars 11000 1726867154.12862: variable 'controller_device' from source: play vars 11000 1726867154.12912: variable 'controller_device' from source: play vars 11000 1726867154.12939: variable 'port1_profile' from source: play vars 11000 1726867154.13054: variable 'port1_profile' from source: play vars 11000 1726867154.13057: variable 'dhcp_interface1' from source: play vars 11000 1726867154.13059: variable 'dhcp_interface1' from source: play vars 11000 1726867154.13061: variable 'controller_profile' from source: play vars 11000 1726867154.13120: variable 'controller_profile' from source: play vars 11000 1726867154.13126: variable 'port2_profile' from source: play vars 11000 1726867154.13180: variable 'port2_profile' from source: play vars 11000 1726867154.13186: variable 'dhcp_interface2' from source: play vars 11000 1726867154.13248: variable 'dhcp_interface2' from source: play vars 11000 1726867154.13265: variable 'controller_profile' from source: play vars 11000 1726867154.13326: variable 'controller_profile' from source: play vars 11000 1726867154.13382: Evaluated conditional (__network_wpa_supplicant_required): False 11000 1726867154.13386: when evaluation is False, skipping this task 11000 1726867154.13390: _execute() done 11000 1726867154.13392: dumping result to json 11000 1726867154.13397: done dumping result, returning 11000 1726867154.13447: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-c734-026a-000000000033] 11000 1726867154.13450: sending task result for task 0affcac9-a3a5-c734-026a-000000000033 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11000 1726867154.13668: no more pending results, returning what we have 11000 1726867154.13671: results queue empty 11000 1726867154.13674: checking for any_errors_fatal 11000 1726867154.13699: done checking for any_errors_fatal 11000 1726867154.13700: checking for max_fail_percentage 11000 1726867154.13701: done checking for max_fail_percentage 11000 1726867154.13702: checking to see if 
all hosts have failed and the running result is not ok 11000 1726867154.13703: done checking to see if all hosts have failed 11000 1726867154.13704: getting the remaining hosts for this loop 11000 1726867154.13705: done getting the remaining hosts for this loop 11000 1726867154.13708: getting the next task for host managed_node1 11000 1726867154.13713: done getting next task for host managed_node1 11000 1726867154.13716: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11000 1726867154.13719: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867154.13732: getting variables 11000 1726867154.13733: in VariableManager get_vars() 11000 1726867154.13772: Calling all_inventory to load vars for managed_node1 11000 1726867154.13778: Calling groups_inventory to load vars for managed_node1 11000 1726867154.13781: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867154.13789: Calling all_plugins_play to load vars for managed_node1 11000 1726867154.13791: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867154.13794: Calling groups_plugins_play to load vars for managed_node1 11000 1726867154.14344: done sending task result for task 0affcac9-a3a5-c734-026a-000000000033 11000 1726867154.14351: WORKER PROCESS EXITING 11000 1726867154.15166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867154.16525: done with get_vars() 11000 1726867154.16546: done getting variables 11000 1726867154.16617: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:19:14 -0400 (0:00:00.106) 0:00:15.809 ****** 11000 1726867154.16655: entering _queue_task() for managed_node1/service 11000 1726867154.16937: worker is 1 (out of 1 available) 11000 1726867154.16950: exiting _queue_task() for managed_node1/service 11000 1726867154.16962: done queuing things up, now waiting for results queue to drain 11000 1726867154.16963: waiting for pending results... 
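The "Enable and start wpa_supplicant" task above is skipped because __network_wpa_supplicant_required evaluated to False; judging from the role defaults consulted during that evaluation (__network_ieee802_1x_connections_defined, __network_wireless_connections_defined), wpa_supplicant is only needed when wireless or IEEE 802.1X profiles are requested, and this run defines only bond and ethernet profiles. A minimal sketch of such a guarded task, for orientation only (not the role's actual task file; module and option names are assumptions):

- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant     # service name assumed for illustration
    state: started
    enabled: true
  when: __network_wpa_supplicant_required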
11000 1726867154.17195: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 11000 1726867154.17328: in run() - task 0affcac9-a3a5-c734-026a-000000000034 11000 1726867154.17335: variable 'ansible_search_path' from source: unknown 11000 1726867154.17339: variable 'ansible_search_path' from source: unknown 11000 1726867154.17370: calling self._execute() 11000 1726867154.17483: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867154.17487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867154.17493: variable 'omit' from source: magic vars 11000 1726867154.17904: variable 'ansible_distribution_major_version' from source: facts 11000 1726867154.17913: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867154.18007: variable 'network_provider' from source: set_fact 11000 1726867154.18011: Evaluated conditional (network_provider == "initscripts"): False 11000 1726867154.18014: when evaluation is False, skipping this task 11000 1726867154.18017: _execute() done 11000 1726867154.18019: dumping result to json 11000 1726867154.18021: done dumping result, returning 11000 1726867154.18029: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-c734-026a-000000000034] 11000 1726867154.18034: sending task result for task 0affcac9-a3a5-c734-026a-000000000034 11000 1726867154.18121: done sending task result for task 0affcac9-a3a5-c734-026a-000000000034 11000 1726867154.18125: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11000 1726867154.18198: no more pending results, returning what we have 11000 1726867154.18201: results queue empty 11000 1726867154.18202: checking for any_errors_fatal 11000 1726867154.18208: done checking for any_errors_fatal 11000 1726867154.18209: checking for max_fail_percentage 11000 1726867154.18211: done checking for max_fail_percentage 11000 1726867154.18211: checking to see if all hosts have failed and the running result is not ok 11000 1726867154.18212: done checking to see if all hosts have failed 11000 1726867154.18213: getting the remaining hosts for this loop 11000 1726867154.18214: done getting the remaining hosts for this loop 11000 1726867154.18217: getting the next task for host managed_node1 11000 1726867154.18222: done getting next task for host managed_node1 11000 1726867154.18225: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11000 1726867154.18228: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867154.18245: getting variables 11000 1726867154.18247: in VariableManager get_vars() 11000 1726867154.18281: Calling all_inventory to load vars for managed_node1 11000 1726867154.18283: Calling groups_inventory to load vars for managed_node1 11000 1726867154.18285: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867154.18293: Calling all_plugins_play to load vars for managed_node1 11000 1726867154.18295: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867154.18298: Calling groups_plugins_play to load vars for managed_node1 11000 1726867154.19380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867154.20599: done with get_vars() 11000 1726867154.20613: done getting variables 11000 1726867154.20653: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:19:14 -0400 (0:00:00.040) 0:00:15.849 ****** 11000 1726867154.20675: entering _queue_task() for managed_node1/copy 11000 1726867154.20913: worker is 1 (out of 1 available) 11000 1726867154.20925: exiting _queue_task() for managed_node1/copy 11000 1726867154.20935: done queuing things up, now waiting for results queue to drain 11000 1726867154.20936: waiting for pending results... 11000 1726867154.21360: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11000 1726867154.21365: in run() - task 0affcac9-a3a5-c734-026a-000000000035 11000 1726867154.21368: variable 'ansible_search_path' from source: unknown 11000 1726867154.21371: variable 'ansible_search_path' from source: unknown 11000 1726867154.21376: calling self._execute() 11000 1726867154.21464: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867154.21468: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867154.21512: variable 'omit' from source: magic vars 11000 1726867154.21857: variable 'ansible_distribution_major_version' from source: facts 11000 1726867154.21867: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867154.21976: variable 'network_provider' from source: set_fact 11000 1726867154.21984: Evaluated conditional (network_provider == "initscripts"): False 11000 1726867154.21987: when evaluation is False, skipping this task 11000 1726867154.21989: _execute() done 11000 1726867154.21992: dumping result to json 11000 1726867154.22054: done dumping result, returning 11000 1726867154.22057: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-c734-026a-000000000035] 11000 1726867154.22060: sending task result for task 0affcac9-a3a5-c734-026a-000000000035 11000 1726867154.22127: done sending task result for task 0affcac9-a3a5-c734-026a-000000000035 11000 1726867154.22129: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 11000 1726867154.22196: no more pending results, returning what we have 11000 1726867154.22199: results queue empty 11000 1726867154.22200: checking for any_errors_fatal 11000 1726867154.22203: done checking for any_errors_fatal 11000 1726867154.22204: checking for max_fail_percentage 11000 1726867154.22205: done checking for max_fail_percentage 11000 1726867154.22206: checking to see if all hosts have failed and the running result is not ok 11000 1726867154.22207: done checking to see if all hosts have failed 11000 1726867154.22207: getting the remaining hosts for this loop 11000 1726867154.22209: done getting the remaining hosts for this loop 11000 1726867154.22212: getting the next task for host managed_node1 11000 1726867154.22217: done getting next task for host managed_node1 11000 1726867154.22220: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11000 1726867154.22222: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867154.22234: getting variables 11000 1726867154.22236: in VariableManager get_vars() 11000 1726867154.22269: Calling all_inventory to load vars for managed_node1 11000 1726867154.22272: Calling groups_inventory to load vars for managed_node1 11000 1726867154.22274: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867154.22285: Calling all_plugins_play to load vars for managed_node1 11000 1726867154.22290: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867154.22292: Calling groups_plugins_play to load vars for managed_node1 11000 1726867154.23417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867154.24282: done with get_vars() 11000 1726867154.24298: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:19:14 -0400 (0:00:00.036) 0:00:15.886 ****** 11000 1726867154.24355: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 11000 1726867154.24357: Creating lock for fedora.linux_system_roles.network_connections 11000 1726867154.24556: worker is 1 (out of 1 available) 11000 1726867154.24567: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 11000 1726867154.24581: done queuing things up, now waiting for results queue to drain 11000 1726867154.24582: waiting for pending results... 
11000 1726867154.24745: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11000 1726867154.24819: in run() - task 0affcac9-a3a5-c734-026a-000000000036 11000 1726867154.24830: variable 'ansible_search_path' from source: unknown 11000 1726867154.24834: variable 'ansible_search_path' from source: unknown 11000 1726867154.24860: calling self._execute() 11000 1726867154.24931: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867154.24936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867154.24944: variable 'omit' from source: magic vars 11000 1726867154.25205: variable 'ansible_distribution_major_version' from source: facts 11000 1726867154.25215: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867154.25220: variable 'omit' from source: magic vars 11000 1726867154.25260: variable 'omit' from source: magic vars 11000 1726867154.25368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867154.27102: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867154.27158: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867154.27193: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867154.27220: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867154.27240: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867154.27301: variable 'network_provider' from source: set_fact 11000 1726867154.27408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867154.27731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867154.27747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867154.27779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867154.27793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867154.27846: variable 'omit' from source: magic vars 11000 1726867154.27937: variable 'omit' from source: magic vars 11000 1726867154.28038: variable 'network_connections' from source: task vars 11000 1726867154.28047: variable 'controller_profile' from source: play vars 11000 1726867154.28113: variable 'controller_profile' from source: play vars 11000 1726867154.28126: variable 'controller_device' from source: play vars 11000 1726867154.28180: variable 'controller_device' from source: play vars 11000 1726867154.28201: variable 'port1_profile' 
from source: play vars 11000 1726867154.28246: variable 'port1_profile' from source: play vars 11000 1726867154.28252: variable 'dhcp_interface1' from source: play vars 11000 1726867154.28297: variable 'dhcp_interface1' from source: play vars 11000 1726867154.28302: variable 'controller_profile' from source: play vars 11000 1726867154.28346: variable 'controller_profile' from source: play vars 11000 1726867154.28352: variable 'port2_profile' from source: play vars 11000 1726867154.28397: variable 'port2_profile' from source: play vars 11000 1726867154.28403: variable 'dhcp_interface2' from source: play vars 11000 1726867154.28482: variable 'dhcp_interface2' from source: play vars 11000 1726867154.28485: variable 'controller_profile' from source: play vars 11000 1726867154.28538: variable 'controller_profile' from source: play vars 11000 1726867154.28659: variable 'omit' from source: magic vars 11000 1726867154.28669: variable '__lsr_ansible_managed' from source: task vars 11000 1726867154.28712: variable '__lsr_ansible_managed' from source: task vars 11000 1726867154.28871: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11000 1726867154.29027: Loaded config def from plugin (lookup/template) 11000 1726867154.29031: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11000 1726867154.29050: File lookup term: get_ansible_managed.j2 11000 1726867154.29053: variable 'ansible_search_path' from source: unknown 11000 1726867154.29057: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11000 1726867154.29067: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11000 1726867154.29082: variable 'ansible_search_path' from source: unknown 11000 1726867154.32355: variable 'ansible_managed' from source: unknown 11000 1726867154.32428: variable 'omit' from source: magic vars 11000 1726867154.32448: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867154.32467: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867154.32482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867154.32497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867154.32505: Loading ShellModule 'sh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867154.32541: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867154.32544: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867154.32546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867154.32608: Set connection var ansible_shell_type to sh 11000 1726867154.32615: Set connection var ansible_pipelining to False 11000 1726867154.32622: Set connection var ansible_shell_executable to /bin/sh 11000 1726867154.32624: Set connection var ansible_connection to ssh 11000 1726867154.32629: Set connection var ansible_timeout to 10 11000 1726867154.32641: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867154.32657: variable 'ansible_shell_executable' from source: unknown 11000 1726867154.32660: variable 'ansible_connection' from source: unknown 11000 1726867154.32663: variable 'ansible_module_compression' from source: unknown 11000 1726867154.32665: variable 'ansible_shell_type' from source: unknown 11000 1726867154.32667: variable 'ansible_shell_executable' from source: unknown 11000 1726867154.32670: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867154.32674: variable 'ansible_pipelining' from source: unknown 11000 1726867154.32678: variable 'ansible_timeout' from source: unknown 11000 1726867154.32683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867154.32766: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11000 1726867154.32774: variable 'omit' from source: magic vars 11000 1726867154.32781: starting attempt loop 11000 1726867154.32784: running the handler 11000 1726867154.32798: _low_level_execute_command(): starting 11000 1726867154.32804: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867154.33308: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867154.33315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867154.33318: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867154.33320: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867154.33368: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867154.33371: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867154.33386: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867154.33459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867154.35112: stdout chunk (state=3): >>>/root <<< 11000 1726867154.35213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867154.35256: stderr chunk (state=3): >>><<< 11000 1726867154.35260: stdout chunk (state=3): >>><<< 11000 1726867154.35281: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867154.35295: _low_level_execute_command(): starting 11000 1726867154.35299: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867154.3527632-11744-29128809415596 `" && echo ansible-tmp-1726867154.3527632-11744-29128809415596="` echo /root/.ansible/tmp/ansible-tmp-1726867154.3527632-11744-29128809415596 `" ) && sleep 0' 11000 1726867154.35804: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867154.35809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867154.35812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867154.35814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867154.35816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867154.35868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867154.35872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 11000 1726867154.35949: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867154.37783: stdout chunk (state=3): >>>ansible-tmp-1726867154.3527632-11744-29128809415596=/root/.ansible/tmp/ansible-tmp-1726867154.3527632-11744-29128809415596 <<< 11000 1726867154.37907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867154.37924: stderr chunk (state=3): >>><<< 11000 1726867154.37927: stdout chunk (state=3): >>><<< 11000 1726867154.37943: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867154.3527632-11744-29128809415596=/root/.ansible/tmp/ansible-tmp-1726867154.3527632-11744-29128809415596 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867154.38009: variable 'ansible_module_compression' from source: unknown 11000 1726867154.38055: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 11000 1726867154.38058: ANSIBALLZ: Acquiring lock 11000 1726867154.38060: ANSIBALLZ: Lock acquired: 139984830985328 11000 1726867154.38062: ANSIBALLZ: Creating module 11000 1726867154.53195: ANSIBALLZ: Writing module into payload 11000 1726867154.53458: ANSIBALLZ: Writing module 11000 1726867154.53478: ANSIBALLZ: Renaming module 11000 1726867154.53484: ANSIBALLZ: Done creating module 11000 1726867154.53508: variable 'ansible_facts' from source: unknown 11000 1726867154.53581: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867154.3527632-11744-29128809415596/AnsiballZ_network_connections.py 11000 1726867154.53705: Sending initial data 11000 1726867154.53708: Sent initial data (167 bytes) 11000 1726867154.54232: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867154.54236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867154.54255: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867154.54307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867154.54310: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867154.54318: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867154.54370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867154.55987: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867154.56026: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11000 1726867154.56084: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpqmo0oza4 /root/.ansible/tmp/ansible-tmp-1726867154.3527632-11744-29128809415596/AnsiballZ_network_connections.py <<< 11000 1726867154.56088: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867154.3527632-11744-29128809415596/AnsiballZ_network_connections.py" <<< 11000 1726867154.56115: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpqmo0oza4" to remote "/root/.ansible/tmp/ansible-tmp-1726867154.3527632-11744-29128809415596/AnsiballZ_network_connections.py" <<< 11000 1726867154.56121: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867154.3527632-11744-29128809415596/AnsiballZ_network_connections.py" <<< 11000 1726867154.57150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867154.57153: stdout chunk (state=3): >>><<< 11000 1726867154.57156: stderr chunk (state=3): >>><<< 11000 1726867154.57157: done transferring module to remote 11000 1726867154.57159: _low_level_execute_command(): starting 11000 1726867154.57161: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867154.3527632-11744-29128809415596/ /root/.ansible/tmp/ansible-tmp-1726867154.3527632-11744-29128809415596/AnsiballZ_network_connections.py && sleep 0' 11000 1726867154.57709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867154.57722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867154.57735: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867154.57800: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867154.57806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867154.57867: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867154.59607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867154.59635: stderr chunk (state=3): >>><<< 11000 1726867154.59639: stdout chunk (state=3): >>><<< 11000 1726867154.59654: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867154.59656: _low_level_execute_command(): starting 11000 1726867154.59662: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867154.3527632-11744-29128809415596/AnsiballZ_network_connections.py && sleep 0' 11000 1726867154.60400: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867154.60438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867154.60455: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867154.60467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867154.60611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867155.03304: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a61dfef3-6218-4c4f-ba0f-002676378e96\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, f31f4939-dccd-4694-8e6c-e832cbfb865b\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, a97880b0-dde0-4b51-ba2e-4449038703da\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a61dfef3-6218-4c4f-ba0f-002676378e96 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, f31f4939-dccd-4694-8e6c-e832cbfb865b (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, a97880b0-dde0-4b51-ba2e-4449038703da (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11000 1726867155.05044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867155.05051: stderr chunk (state=3): >>>Shared connection to 10.31.12.57 closed. 
<<< 11000 1726867155.05113: stderr chunk (state=3): >>><<< 11000 1726867155.05122: stdout chunk (state=3): >>><<< 11000 1726867155.05145: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a61dfef3-6218-4c4f-ba0f-002676378e96\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, f31f4939-dccd-4694-8e6c-e832cbfb865b\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, a97880b0-dde0-4b51-ba2e-4449038703da\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a61dfef3-6218-4c4f-ba0f-002676378e96 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, f31f4939-dccd-4694-8e6c-e832cbfb865b (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, a97880b0-dde0-4b51-ba2e-4449038703da (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
11000 1726867155.05217: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'deprecated-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'master': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'master': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867154.3527632-11744-29128809415596/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867155.05283: _low_level_execute_command(): starting 11000 1726867155.05289: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867154.3527632-11744-29128809415596/ > /dev/null 2>&1 && sleep 0' 11000 1726867155.05892: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867155.05900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867155.05966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867155.06018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867155.06030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867155.06049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867155.06124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867155.08298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867155.08301: stdout chunk (state=3): >>><<< 11000 1726867155.08303: stderr chunk (state=3): >>><<< 11000 1726867155.08306: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867155.08308: handler run complete 11000 1726867155.08310: attempt loop complete, returning result 11000 1726867155.08312: _execute() done 11000 1726867155.08314: dumping result to json 11000 1726867155.08316: done dumping result, returning 11000 1726867155.08318: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-c734-026a-000000000036] 11000 1726867155.08320: sending task result for task 0affcac9-a3a5-c734-026a-000000000036 11000 1726867155.08408: done sending task result for task 0affcac9-a3a5-c734-026a-000000000036 11000 1726867155.08412: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "deprecated-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "interface_name": "test1", "master": "bond0", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "interface_name": "test2", "master": "bond0", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a61dfef3-6218-4c4f-ba0f-002676378e96 [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, f31f4939-dccd-4694-8e6c-e832cbfb865b [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, a97880b0-dde0-4b51-ba2e-4449038703da [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a61dfef3-6218-4c4f-ba0f-002676378e96 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, f31f4939-dccd-4694-8e6c-e832cbfb865b (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, a97880b0-dde0-4b51-ba2e-4449038703da (not-active) 11000 1726867155.08552: no more pending results, returning what we have 11000 1726867155.08556: results queue empty 11000 1726867155.08560: checking for any_errors_fatal 11000 1726867155.08565: done checking for any_errors_fatal 11000 1726867155.08566: checking for max_fail_percentage 11000 1726867155.08567: done checking for max_fail_percentage 11000 1726867155.08568: checking to see if all hosts have failed and the running result is not ok 11000 1726867155.08569: done checking to see if all hosts have failed 11000 1726867155.08570: getting the 
remaining hosts for this loop 11000 1726867155.08571: done getting the remaining hosts for this loop 11000 1726867155.08681: getting the next task for host managed_node1 11000 1726867155.08688: done getting next task for host managed_node1 11000 1726867155.08793: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11000 1726867155.08797: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867155.08832: getting variables 11000 1726867155.08834: in VariableManager get_vars() 11000 1726867155.09088: Calling all_inventory to load vars for managed_node1 11000 1726867155.09091: Calling groups_inventory to load vars for managed_node1 11000 1726867155.09093: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867155.09108: Calling all_plugins_play to load vars for managed_node1 11000 1726867155.09142: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867155.09146: Calling groups_plugins_play to load vars for managed_node1 11000 1726867155.12159: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867155.14012: done with get_vars() 11000 1726867155.14034: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:19:15 -0400 (0:00:00.897) 0:00:16.784 ****** 11000 1726867155.14135: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 11000 1726867155.14137: Creating lock for fedora.linux_system_roles.network_state 11000 1726867155.14716: worker is 1 (out of 1 available) 11000 1726867155.14727: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 11000 1726867155.14738: done queuing things up, now waiting for results queue to drain 11000 1726867155.14739: waiting for pending results... 
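The "Configure networking connection profiles" result above shows the exact arguments the role rendered for fedora.linux_system_roles.network_connections. Read back as play-level YAML, they correspond to a network_connections variable like the hedged reconstruction below: the values come straight from the reported module_args (bond0 in active-backup mode with miimon 110 and route_metric4 65535, plus two ethernet ports on test1 and test2), while the mapping onto the play vars referenced in the trace (controller_profile, controller_device, port1_profile, dhcp_interface1, port2_profile, dhcp_interface2) is inferred:

network_connections:
  - name: "{{ controller_profile }}"           # rendered as bond0
    state: up
    type: bond
    interface_name: "{{ controller_device }}"  # rendered as deprecated-bond
    bond:
      mode: active-backup
      miimon: 110
    ip:
      route_metric4: 65535
  - name: "{{ port1_profile }}"                # rendered as bond0.0
    state: up
    type: ethernet
    interface_name: "{{ dhcp_interface1 }}"    # rendered as test1
    master: "{{ controller_profile }}"
  - name: "{{ port2_profile }}"                # rendered as bond0.1
    state: up
    type: ethernet
    interface_name: "{{ dhcp_interface2 }}"    # rendered as test2
    master: "{{ controller_profile }}"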
11000 1726867155.15312: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 11000 1726867155.15316: in run() - task 0affcac9-a3a5-c734-026a-000000000037 11000 1726867155.15319: variable 'ansible_search_path' from source: unknown 11000 1726867155.15322: variable 'ansible_search_path' from source: unknown 11000 1726867155.15324: calling self._execute() 11000 1726867155.15406: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867155.15422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867155.15436: variable 'omit' from source: magic vars 11000 1726867155.15866: variable 'ansible_distribution_major_version' from source: facts 11000 1726867155.15885: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867155.16019: variable 'network_state' from source: role '' defaults 11000 1726867155.16035: Evaluated conditional (network_state != {}): False 11000 1726867155.16043: when evaluation is False, skipping this task 11000 1726867155.16167: _execute() done 11000 1726867155.16173: dumping result to json 11000 1726867155.16176: done dumping result, returning 11000 1726867155.16181: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-c734-026a-000000000037] 11000 1726867155.16183: sending task result for task 0affcac9-a3a5-c734-026a-000000000037 11000 1726867155.16251: done sending task result for task 0affcac9-a3a5-c734-026a-000000000037 11000 1726867155.16255: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11000 1726867155.16311: no more pending results, returning what we have 11000 1726867155.16315: results queue empty 11000 1726867155.16316: checking for any_errors_fatal 11000 1726867155.16328: done checking for any_errors_fatal 11000 1726867155.16329: checking for max_fail_percentage 11000 1726867155.16331: done checking for max_fail_percentage 11000 1726867155.16336: checking to see if all hosts have failed and the running result is not ok 11000 1726867155.16337: done checking to see if all hosts have failed 11000 1726867155.16340: getting the remaining hosts for this loop 11000 1726867155.16341: done getting the remaining hosts for this loop 11000 1726867155.16345: getting the next task for host managed_node1 11000 1726867155.16351: done getting next task for host managed_node1 11000 1726867155.16355: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11000 1726867155.16358: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867155.16374: getting variables 11000 1726867155.16375: in VariableManager get_vars() 11000 1726867155.16539: Calling all_inventory to load vars for managed_node1 11000 1726867155.16542: Calling groups_inventory to load vars for managed_node1 11000 1726867155.16545: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867155.16557: Calling all_plugins_play to load vars for managed_node1 11000 1726867155.16561: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867155.16564: Calling groups_plugins_play to load vars for managed_node1 11000 1726867155.18799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867155.21417: done with get_vars() 11000 1726867155.21445: done getting variables 11000 1726867155.21515: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:19:15 -0400 (0:00:00.074) 0:00:16.858 ****** 11000 1726867155.21550: entering _queue_task() for managed_node1/debug 11000 1726867155.22332: worker is 1 (out of 1 available) 11000 1726867155.22345: exiting _queue_task() for managed_node1/debug 11000 1726867155.22358: done queuing things up, now waiting for results queue to drain 11000 1726867155.22359: waiting for pending results... 11000 1726867155.22797: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11000 1726867155.22856: in run() - task 0affcac9-a3a5-c734-026a-000000000038 11000 1726867155.22874: variable 'ansible_search_path' from source: unknown 11000 1726867155.22881: variable 'ansible_search_path' from source: unknown 11000 1726867155.22932: calling self._execute() 11000 1726867155.23048: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867155.23053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867155.23142: variable 'omit' from source: magic vars 11000 1726867155.24064: variable 'ansible_distribution_major_version' from source: facts 11000 1726867155.24158: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867155.24165: variable 'omit' from source: magic vars 11000 1726867155.24235: variable 'omit' from source: magic vars 11000 1726867155.24313: variable 'omit' from source: magic vars 11000 1726867155.24358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867155.24401: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867155.24420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867155.24441: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867155.24453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867155.24562: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867155.24565: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867155.24568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867155.24595: Set connection var ansible_shell_type to sh 11000 1726867155.24604: Set connection var ansible_pipelining to False 11000 1726867155.24612: Set connection var ansible_shell_executable to /bin/sh 11000 1726867155.24615: Set connection var ansible_connection to ssh 11000 1726867155.24620: Set connection var ansible_timeout to 10 11000 1726867155.24626: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867155.24751: variable 'ansible_shell_executable' from source: unknown 11000 1726867155.24755: variable 'ansible_connection' from source: unknown 11000 1726867155.24758: variable 'ansible_module_compression' from source: unknown 11000 1726867155.24760: variable 'ansible_shell_type' from source: unknown 11000 1726867155.24763: variable 'ansible_shell_executable' from source: unknown 11000 1726867155.24765: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867155.24769: variable 'ansible_pipelining' from source: unknown 11000 1726867155.24781: variable 'ansible_timeout' from source: unknown 11000 1726867155.24784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867155.25183: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867155.25190: variable 'omit' from source: magic vars 11000 1726867155.25192: starting attempt loop 11000 1726867155.25195: running the handler 11000 1726867155.25696: variable '__network_connections_result' from source: set_fact 11000 1726867155.25699: handler run complete 11000 1726867155.25702: attempt loop complete, returning result 11000 1726867155.25703: _execute() done 11000 1726867155.25706: dumping result to json 11000 1726867155.25708: done dumping result, returning 11000 1726867155.25915: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-c734-026a-000000000038] 11000 1726867155.25918: sending task result for task 0affcac9-a3a5-c734-026a-000000000038 11000 1726867155.25989: done sending task result for task 0affcac9-a3a5-c734-026a-000000000038 11000 1726867155.25992: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a61dfef3-6218-4c4f-ba0f-002676378e96", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, f31f4939-dccd-4694-8e6c-e832cbfb865b", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, a97880b0-dde0-4b51-ba2e-4449038703da", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a61dfef3-6218-4c4f-ba0f-002676378e96 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, f31f4939-dccd-4694-8e6c-e832cbfb865b (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, a97880b0-dde0-4b51-ba2e-4449038703da (not-active)" ] } 11000 1726867155.26093: no more pending results, 
returning what we have 11000 1726867155.26096: results queue empty 11000 1726867155.26097: checking for any_errors_fatal 11000 1726867155.26103: done checking for any_errors_fatal 11000 1726867155.26104: checking for max_fail_percentage 11000 1726867155.26105: done checking for max_fail_percentage 11000 1726867155.26106: checking to see if all hosts have failed and the running result is not ok 11000 1726867155.26107: done checking to see if all hosts have failed 11000 1726867155.26108: getting the remaining hosts for this loop 11000 1726867155.26110: done getting the remaining hosts for this loop 11000 1726867155.26114: getting the next task for host managed_node1 11000 1726867155.26120: done getting next task for host managed_node1 11000 1726867155.26183: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11000 1726867155.26188: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867155.26201: getting variables 11000 1726867155.26202: in VariableManager get_vars() 11000 1726867155.26354: Calling all_inventory to load vars for managed_node1 11000 1726867155.26358: Calling groups_inventory to load vars for managed_node1 11000 1726867155.26360: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867155.26369: Calling all_plugins_play to load vars for managed_node1 11000 1726867155.26371: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867155.26374: Calling groups_plugins_play to load vars for managed_node1 11000 1726867155.28073: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867155.30189: done with get_vars() 11000 1726867155.30216: done getting variables 11000 1726867155.30276: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:19:15 -0400 (0:00:00.091) 0:00:16.950 ****** 11000 1726867155.30721: entering _queue_task() for managed_node1/debug 11000 1726867155.31344: worker is 1 (out of 1 available) 11000 1726867155.31355: exiting _queue_task() for managed_node1/debug 11000 1726867155.31367: done queuing things up, now waiting for results queue to drain 11000 1726867155.31369: waiting for pending results... 
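Both "Show stderr messages for the network_connections" (result above) and "Show debug messages for the network_connections" (run next) are plain debug tasks over the registered __network_connections_result fact. A minimal sketch of an equivalent task follows; the task name and displayed variable are taken from the log, but the exact YAML in roles/network/tasks/main.yml is assumed, and the distribution check may be inherited from an enclosing block rather than set on the task itself:

# Sketch of a debug task equivalent to the one logged above (assumed layout).
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines
  when: ansible_distribution_major_version != '6'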
11000 1726867155.32000: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11000 1726867155.32341: in run() - task 0affcac9-a3a5-c734-026a-000000000039 11000 1726867155.32347: variable 'ansible_search_path' from source: unknown 11000 1726867155.32350: variable 'ansible_search_path' from source: unknown 11000 1726867155.32354: calling self._execute() 11000 1726867155.32437: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867155.32668: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867155.32672: variable 'omit' from source: magic vars 11000 1726867155.33537: variable 'ansible_distribution_major_version' from source: facts 11000 1726867155.33541: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867155.33544: variable 'omit' from source: magic vars 11000 1726867155.33547: variable 'omit' from source: magic vars 11000 1726867155.33601: variable 'omit' from source: magic vars 11000 1726867155.33758: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867155.33804: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867155.33830: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867155.33883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867155.33989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867155.34025: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867155.34034: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867155.34043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867155.34260: Set connection var ansible_shell_type to sh 11000 1726867155.34274: Set connection var ansible_pipelining to False 11000 1726867155.34296: Set connection var ansible_shell_executable to /bin/sh 11000 1726867155.34483: Set connection var ansible_connection to ssh 11000 1726867155.34488: Set connection var ansible_timeout to 10 11000 1726867155.34491: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867155.34493: variable 'ansible_shell_executable' from source: unknown 11000 1726867155.34495: variable 'ansible_connection' from source: unknown 11000 1726867155.34497: variable 'ansible_module_compression' from source: unknown 11000 1726867155.34499: variable 'ansible_shell_type' from source: unknown 11000 1726867155.34501: variable 'ansible_shell_executable' from source: unknown 11000 1726867155.34503: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867155.34505: variable 'ansible_pipelining' from source: unknown 11000 1726867155.34507: variable 'ansible_timeout' from source: unknown 11000 1726867155.34509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867155.34988: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 
1726867155.34993: variable 'omit' from source: magic vars 11000 1726867155.34996: starting attempt loop 11000 1726867155.34998: running the handler 11000 1726867155.35000: variable '__network_connections_result' from source: set_fact 11000 1726867155.35106: variable '__network_connections_result' from source: set_fact 11000 1726867155.35476: handler run complete 11000 1726867155.35516: attempt loop complete, returning result 11000 1726867155.35590: _execute() done 11000 1726867155.35599: dumping result to json 11000 1726867155.35610: done dumping result, returning 11000 1726867155.35623: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-c734-026a-000000000039] 11000 1726867155.35639: sending task result for task 0affcac9-a3a5-c734-026a-000000000039 11000 1726867155.36102: done sending task result for task 0affcac9-a3a5-c734-026a-000000000039 11000 1726867155.36106: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "deprecated-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "interface_name": "test1", "master": "bond0", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "interface_name": "test2", "master": "bond0", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a61dfef3-6218-4c4f-ba0f-002676378e96\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, f31f4939-dccd-4694-8e6c-e832cbfb865b\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, a97880b0-dde0-4b51-ba2e-4449038703da\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a61dfef3-6218-4c4f-ba0f-002676378e96 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, f31f4939-dccd-4694-8e6c-e832cbfb865b (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, a97880b0-dde0-4b51-ba2e-4449038703da (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, a61dfef3-6218-4c4f-ba0f-002676378e96", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, f31f4939-dccd-4694-8e6c-e832cbfb865b", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, a97880b0-dde0-4b51-ba2e-4449038703da", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, a61dfef3-6218-4c4f-ba0f-002676378e96 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, f31f4939-dccd-4694-8e6c-e832cbfb865b (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, a97880b0-dde0-4b51-ba2e-4449038703da (not-active)" ] } } 11000 1726867155.36205: no more pending results, returning what we have 11000 1726867155.36208: results queue empty 11000 1726867155.36214: checking for any_errors_fatal 11000 1726867155.36219: done checking for any_errors_fatal 11000 1726867155.36220: checking for max_fail_percentage 11000 1726867155.36221: done checking for 
max_fail_percentage 11000 1726867155.36222: checking to see if all hosts have failed and the running result is not ok 11000 1726867155.36223: done checking to see if all hosts have failed 11000 1726867155.36223: getting the remaining hosts for this loop 11000 1726867155.36225: done getting the remaining hosts for this loop 11000 1726867155.36228: getting the next task for host managed_node1 11000 1726867155.36233: done getting next task for host managed_node1 11000 1726867155.36236: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11000 1726867155.36239: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867155.36248: getting variables 11000 1726867155.36249: in VariableManager get_vars() 11000 1726867155.36286: Calling all_inventory to load vars for managed_node1 11000 1726867155.36289: Calling groups_inventory to load vars for managed_node1 11000 1726867155.36291: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867155.36299: Calling all_plugins_play to load vars for managed_node1 11000 1726867155.36302: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867155.36305: Calling groups_plugins_play to load vars for managed_node1 11000 1726867155.39351: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867155.41508: done with get_vars() 11000 1726867155.41531: done getting variables 11000 1726867155.41604: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:19:15 -0400 (0:00:00.109) 0:00:17.059 ****** 11000 1726867155.41642: entering _queue_task() for managed_node1/debug 11000 1726867155.42014: worker is 1 (out of 1 available) 11000 1726867155.42028: exiting _queue_task() for managed_node1/debug 11000 1726867155.42192: done queuing things up, now waiting for results queue to drain 11000 1726867155.42193: waiting for pending results... 
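The network_state tasks in this run ("Configure networking state" earlier, "Show debug messages for the network_state" next) are skipped because network_state defaults to an empty dict; the log records the gate as false_condition "network_state != {}". A sketch of that gating pattern, with an assumed placeholder body, looks like:

# Sketch of the conditional seen in the log; the real task delegates to the
# nmstate-based provider, which is not shown here (assumption).
- name: Configure networking state
  ansible.builtin.debug:
    msg: Placeholder for the network_state provider path (assumption)
  when: network_state != {}

Supplying a non-empty network_state mapping in the play would flip the conditional to True, and these tasks would run instead of being skipped.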
11000 1726867155.42342: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11000 1726867155.42470: in run() - task 0affcac9-a3a5-c734-026a-00000000003a 11000 1726867155.42490: variable 'ansible_search_path' from source: unknown 11000 1726867155.42494: variable 'ansible_search_path' from source: unknown 11000 1726867155.42538: calling self._execute() 11000 1726867155.42646: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867155.42651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867155.42653: variable 'omit' from source: magic vars 11000 1726867155.43036: variable 'ansible_distribution_major_version' from source: facts 11000 1726867155.43048: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867155.43294: variable 'network_state' from source: role '' defaults 11000 1726867155.43297: Evaluated conditional (network_state != {}): False 11000 1726867155.43299: when evaluation is False, skipping this task 11000 1726867155.43301: _execute() done 11000 1726867155.43303: dumping result to json 11000 1726867155.43305: done dumping result, returning 11000 1726867155.43307: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-c734-026a-00000000003a] 11000 1726867155.43309: sending task result for task 0affcac9-a3a5-c734-026a-00000000003a 11000 1726867155.43371: done sending task result for task 0affcac9-a3a5-c734-026a-00000000003a 11000 1726867155.43374: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 11000 1726867155.43446: no more pending results, returning what we have 11000 1726867155.43450: results queue empty 11000 1726867155.43450: checking for any_errors_fatal 11000 1726867155.43465: done checking for any_errors_fatal 11000 1726867155.43466: checking for max_fail_percentage 11000 1726867155.43468: done checking for max_fail_percentage 11000 1726867155.43469: checking to see if all hosts have failed and the running result is not ok 11000 1726867155.43470: done checking to see if all hosts have failed 11000 1726867155.43471: getting the remaining hosts for this loop 11000 1726867155.43472: done getting the remaining hosts for this loop 11000 1726867155.43476: getting the next task for host managed_node1 11000 1726867155.43516: done getting next task for host managed_node1 11000 1726867155.43521: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11000 1726867155.43524: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867155.43540: getting variables 11000 1726867155.43542: in VariableManager get_vars() 11000 1726867155.43736: Calling all_inventory to load vars for managed_node1 11000 1726867155.43740: Calling groups_inventory to load vars for managed_node1 11000 1726867155.43742: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867155.43751: Calling all_plugins_play to load vars for managed_node1 11000 1726867155.43754: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867155.43757: Calling groups_plugins_play to load vars for managed_node1 11000 1726867155.45473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867155.47801: done with get_vars() 11000 1726867155.47828: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:19:15 -0400 (0:00:00.062) 0:00:17.122 ****** 11000 1726867155.47931: entering _queue_task() for managed_node1/ping 11000 1726867155.47933: Creating lock for ping 11000 1726867155.48380: worker is 1 (out of 1 available) 11000 1726867155.48393: exiting _queue_task() for managed_node1/ping 11000 1726867155.48403: done queuing things up, now waiting for results queue to drain 11000 1726867155.48405: waiting for pending results... 11000 1726867155.48574: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 11000 1726867155.48718: in run() - task 0affcac9-a3a5-c734-026a-00000000003b 11000 1726867155.48742: variable 'ansible_search_path' from source: unknown 11000 1726867155.48749: variable 'ansible_search_path' from source: unknown 11000 1726867155.48792: calling self._execute() 11000 1726867155.48898: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867155.49020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867155.49023: variable 'omit' from source: magic vars 11000 1726867155.49308: variable 'ansible_distribution_major_version' from source: facts 11000 1726867155.49325: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867155.49336: variable 'omit' from source: magic vars 11000 1726867155.49404: variable 'omit' from source: magic vars 11000 1726867155.49443: variable 'omit' from source: magic vars 11000 1726867155.49497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867155.49535: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867155.49566: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867155.49594: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867155.49609: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867155.49641: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867155.49675: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867155.49680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867155.49760: Set connection var ansible_shell_type to sh 11000 
1726867155.49774: Set connection var ansible_pipelining to False 11000 1726867155.49884: Set connection var ansible_shell_executable to /bin/sh 11000 1726867155.49891: Set connection var ansible_connection to ssh 11000 1726867155.49893: Set connection var ansible_timeout to 10 11000 1726867155.49896: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867155.49897: variable 'ansible_shell_executable' from source: unknown 11000 1726867155.49900: variable 'ansible_connection' from source: unknown 11000 1726867155.49902: variable 'ansible_module_compression' from source: unknown 11000 1726867155.49904: variable 'ansible_shell_type' from source: unknown 11000 1726867155.49906: variable 'ansible_shell_executable' from source: unknown 11000 1726867155.49908: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867155.49910: variable 'ansible_pipelining' from source: unknown 11000 1726867155.49912: variable 'ansible_timeout' from source: unknown 11000 1726867155.49914: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867155.50140: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11000 1726867155.50145: variable 'omit' from source: magic vars 11000 1726867155.50147: starting attempt loop 11000 1726867155.50150: running the handler 11000 1726867155.50152: _low_level_execute_command(): starting 11000 1726867155.50154: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867155.50896: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867155.50983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867155.51002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867155.51047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867155.51066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867155.51094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867155.51247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867155.52913: stdout chunk (state=3): >>>/root <<< 11000 1726867155.53097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867155.53100: stdout chunk (state=3): >>><<< 11000 1726867155.53102: stderr chunk (state=3): >>><<< 11000 1726867155.53309: _low_level_execute_command() done: rc=0, stdout=/root 
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867155.53314: _low_level_execute_command(): starting 11000 1726867155.53318: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867155.5321858-11787-47708819556624 `" && echo ansible-tmp-1726867155.5321858-11787-47708819556624="` echo /root/.ansible/tmp/ansible-tmp-1726867155.5321858-11787-47708819556624 `" ) && sleep 0' 11000 1726867155.54568: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867155.54571: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867155.54574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867155.54578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867155.54581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867155.54593: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867155.54596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867155.54598: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11000 1726867155.54601: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867155.54769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867155.54829: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867155.56845: stdout chunk (state=3): >>>ansible-tmp-1726867155.5321858-11787-47708819556624=/root/.ansible/tmp/ansible-tmp-1726867155.5321858-11787-47708819556624 <<< 11000 1726867155.56979: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867155.56983: stdout chunk (state=3): >>><<< 11000 1726867155.56990: stderr chunk (state=3): >>><<< 11000 1726867155.57019: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867155.5321858-11787-47708819556624=/root/.ansible/tmp/ansible-tmp-1726867155.5321858-11787-47708819556624 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867155.57069: variable 'ansible_module_compression' from source: unknown 11000 1726867155.57156: ANSIBALLZ: Using lock for ping 11000 1726867155.57159: ANSIBALLZ: Acquiring lock 11000 1726867155.57162: ANSIBALLZ: Lock acquired: 139984828910240 11000 1726867155.57164: ANSIBALLZ: Creating module 11000 1726867155.81948: ANSIBALLZ: Writing module into payload 11000 1726867155.82025: ANSIBALLZ: Writing module 11000 1726867155.82049: ANSIBALLZ: Renaming module 11000 1726867155.82083: ANSIBALLZ: Done creating module 11000 1726867155.82095: variable 'ansible_facts' from source: unknown 11000 1726867155.82258: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867155.5321858-11787-47708819556624/AnsiballZ_ping.py 11000 1726867155.82403: Sending initial data 11000 1726867155.82612: Sent initial data (152 bytes) 11000 1726867155.83172: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867155.83293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867155.83583: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867155.83861: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867155.85496: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11000 1726867155.85515: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867155.85584: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11000 1726867155.85625: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpbm8185p4 /root/.ansible/tmp/ansible-tmp-1726867155.5321858-11787-47708819556624/AnsiballZ_ping.py <<< 11000 1726867155.85646: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867155.5321858-11787-47708819556624/AnsiballZ_ping.py" <<< 11000 1726867155.85685: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpbm8185p4" to remote "/root/.ansible/tmp/ansible-tmp-1726867155.5321858-11787-47708819556624/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867155.5321858-11787-47708819556624/AnsiballZ_ping.py" <<< 11000 1726867155.86468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867155.86476: stdout chunk (state=3): >>><<< 11000 1726867155.86491: stderr chunk (state=3): >>><<< 11000 1726867155.86534: done transferring module to remote 11000 1726867155.86552: _low_level_execute_command(): starting 11000 1726867155.86561: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867155.5321858-11787-47708819556624/ /root/.ansible/tmp/ansible-tmp-1726867155.5321858-11787-47708819556624/AnsiballZ_ping.py && sleep 0' 11000 1726867155.87592: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867155.87608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867155.87622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867155.87646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867155.87673: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867155.87692: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867155.87710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867155.87740: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867155.87837: 
stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867155.87848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867155.88284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867155.88338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867155.90157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867155.90160: stdout chunk (state=3): >>><<< 11000 1726867155.90163: stderr chunk (state=3): >>><<< 11000 1726867155.90262: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867155.90266: _low_level_execute_command(): starting 11000 1726867155.90268: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867155.5321858-11787-47708819556624/AnsiballZ_ping.py && sleep 0' 11000 1726867155.91170: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867155.91179: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867155.91195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867155.91274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867156.06387: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11000 1726867156.07927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867156.07931: stdout chunk (state=3): >>><<< 11000 1726867156.07933: stderr chunk (state=3): >>><<< 11000 1726867156.07935: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
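The module run above is the "Re-test connectivity" step: the ping module is transferred as AnsiballZ_ping.py over the multiplexed SSH connection, executed with /usr/bin/python3.12, and returns {"ping": "pong"}. A minimal sketch of such a task, with the exact YAML at roles/network/tasks/main.yml:192 assumed rather than quoted, is:

# Sketch of the connectivity re-test task (assumed YAML, behaviour per the log).
- name: Re-test connectivity
  ansible.builtin.ping:

Running ansible managed_node1 -m ping against the same inventory would be a roughly equivalent ad-hoc check.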
11000 1726867156.07939: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867155.5321858-11787-47708819556624/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867156.07941: _low_level_execute_command(): starting 11000 1726867156.07943: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867155.5321858-11787-47708819556624/ > /dev/null 2>&1 && sleep 0' 11000 1726867156.09149: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867156.09154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867156.09157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867156.09181: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867156.09261: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867156.11147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867156.11159: stderr chunk (state=3): >>><<< 11000 1726867156.11166: stdout chunk (state=3): >>><<< 11000 1726867156.11191: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867156.11384: handler run complete 11000 1726867156.11387: attempt loop complete, returning result 11000 1726867156.11390: _execute() done 11000 1726867156.11392: dumping result to json 11000 1726867156.11394: done dumping result, returning 11000 1726867156.11396: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-c734-026a-00000000003b] 11000 1726867156.11399: sending task result for task 0affcac9-a3a5-c734-026a-00000000003b 11000 1726867156.11467: done sending task result for task 0affcac9-a3a5-c734-026a-00000000003b 11000 1726867156.11471: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 11000 1726867156.11530: no more pending results, returning what we have 11000 1726867156.11534: results queue empty 11000 1726867156.11534: checking for any_errors_fatal 11000 1726867156.11540: done checking for any_errors_fatal 11000 1726867156.11540: checking for max_fail_percentage 11000 1726867156.11542: done checking for max_fail_percentage 11000 1726867156.11542: checking to see if all hosts have failed and the running result is not ok 11000 1726867156.11543: done checking to see if all hosts have failed 11000 1726867156.11544: getting the remaining hosts for this loop 11000 1726867156.11545: done getting the remaining hosts for this loop 11000 1726867156.11548: getting the next task for host managed_node1 11000 1726867156.11594: done getting next task for host managed_node1 11000 1726867156.11596: ^ task is: TASK: meta (role_complete) 11000 1726867156.11599: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867156.11610: getting variables 11000 1726867156.11612: in VariableManager get_vars() 11000 1726867156.11648: Calling all_inventory to load vars for managed_node1 11000 1726867156.11650: Calling groups_inventory to load vars for managed_node1 11000 1726867156.11652: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867156.11660: Calling all_plugins_play to load vars for managed_node1 11000 1726867156.11715: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867156.11722: Calling groups_plugins_play to load vars for managed_node1 11000 1726867156.12648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867156.13522: done with get_vars() 11000 1726867156.13543: done getting variables 11000 1726867156.13634: done queuing things up, now waiting for results queue to drain 11000 1726867156.13636: results queue empty 11000 1726867156.13637: checking for any_errors_fatal 11000 1726867156.13639: done checking for any_errors_fatal 11000 1726867156.13640: checking for max_fail_percentage 11000 1726867156.13640: done checking for max_fail_percentage 11000 1726867156.13641: checking to see if all hosts have failed and the running result is not ok 11000 1726867156.13642: done checking to see if all hosts have failed 11000 1726867156.13643: getting the remaining hosts for this loop 11000 1726867156.13643: done getting the remaining hosts for this loop 11000 1726867156.13646: getting the next task for host managed_node1 11000 1726867156.13650: done getting next task for host managed_node1 11000 1726867156.13653: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11000 1726867156.13655: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867156.13657: getting variables 11000 1726867156.13658: in VariableManager get_vars() 11000 1726867156.13671: Calling all_inventory to load vars for managed_node1 11000 1726867156.13673: Calling groups_inventory to load vars for managed_node1 11000 1726867156.13675: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867156.13681: Calling all_plugins_play to load vars for managed_node1 11000 1726867156.13684: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867156.13686: Calling groups_plugins_play to load vars for managed_node1 11000 1726867156.14624: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867156.15474: done with get_vars() 11000 1726867156.15490: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 17:19:16 -0400 (0:00:00.676) 0:00:17.798 ****** 11000 1726867156.15542: entering _queue_task() for managed_node1/include_tasks 11000 1726867156.15758: worker is 1 (out of 1 available) 11000 1726867156.15771: exiting _queue_task() for managed_node1/include_tasks 11000 1726867156.15785: done queuing things up, now waiting for results queue to drain 11000 1726867156.15786: waiting for pending results... 11000 1726867156.15951: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 11000 1726867156.16036: in run() - task 0affcac9-a3a5-c734-026a-00000000006e 11000 1726867156.16047: variable 'ansible_search_path' from source: unknown 11000 1726867156.16050: variable 'ansible_search_path' from source: unknown 11000 1726867156.16076: calling self._execute() 11000 1726867156.16153: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867156.16157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867156.16165: variable 'omit' from source: magic vars 11000 1726867156.16512: variable 'ansible_distribution_major_version' from source: facts 11000 1726867156.16515: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867156.16518: _execute() done 11000 1726867156.16521: dumping result to json 11000 1726867156.16524: done dumping result, returning 11000 1726867156.16526: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [0affcac9-a3a5-c734-026a-00000000006e] 11000 1726867156.16528: sending task result for task 0affcac9-a3a5-c734-026a-00000000006e 11000 1726867156.16738: done sending task result for task 0affcac9-a3a5-c734-026a-00000000006e 11000 1726867156.16740: WORKER PROCESS EXITING 11000 1726867156.16763: no more pending results, returning what we have 11000 1726867156.16766: in VariableManager get_vars() 11000 1726867156.16807: Calling all_inventory to load vars for managed_node1 11000 1726867156.16812: Calling groups_inventory to load vars for managed_node1 11000 1726867156.16814: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867156.16823: Calling all_plugins_play to load vars for managed_node1 11000 1726867156.16825: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867156.16828: Calling groups_plugins_play to load vars for managed_node1 11000 1726867156.18272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 11000 1726867156.20206: done with get_vars() 11000 1726867156.20225: variable 'ansible_search_path' from source: unknown 11000 1726867156.20226: variable 'ansible_search_path' from source: unknown 11000 1726867156.20274: we have included files to process 11000 1726867156.20275: generating all_blocks data 11000 1726867156.20279: done generating all_blocks data 11000 1726867156.20284: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11000 1726867156.20285: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11000 1726867156.20287: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11000 1726867156.20482: done processing included file 11000 1726867156.20485: iterating over new_blocks loaded from include file 11000 1726867156.20486: in VariableManager get_vars() 11000 1726867156.20505: done with get_vars() 11000 1726867156.20507: filtering new block on tags 11000 1726867156.20524: done filtering new block on tags 11000 1726867156.20526: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 11000 1726867156.20531: extending task lists for all hosts with included blocks 11000 1726867156.20642: done extending task lists 11000 1726867156.20644: done processing included files 11000 1726867156.20644: results queue empty 11000 1726867156.20645: checking for any_errors_fatal 11000 1726867156.20646: done checking for any_errors_fatal 11000 1726867156.20647: checking for max_fail_percentage 11000 1726867156.20648: done checking for max_fail_percentage 11000 1726867156.20649: checking to see if all hosts have failed and the running result is not ok 11000 1726867156.20649: done checking to see if all hosts have failed 11000 1726867156.20650: getting the remaining hosts for this loop 11000 1726867156.20651: done getting the remaining hosts for this loop 11000 1726867156.20654: getting the next task for host managed_node1 11000 1726867156.20658: done getting next task for host managed_node1 11000 1726867156.20660: ^ task is: TASK: Get stat for interface {{ interface }} 11000 1726867156.20662: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867156.20665: getting variables 11000 1726867156.20666: in VariableManager get_vars() 11000 1726867156.20681: Calling all_inventory to load vars for managed_node1 11000 1726867156.20683: Calling groups_inventory to load vars for managed_node1 11000 1726867156.20687: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867156.20701: Calling all_plugins_play to load vars for managed_node1 11000 1726867156.20703: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867156.20706: Calling groups_plugins_play to load vars for managed_node1 11000 1726867156.22022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867156.23705: done with get_vars() 11000 1726867156.23728: done getting variables 11000 1726867156.23924: variable 'interface' from source: task vars 11000 1726867156.23928: variable 'controller_device' from source: play vars 11000 1726867156.23999: variable 'controller_device' from source: play vars TASK [Get stat for interface deprecated-bond] ********************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 17:19:16 -0400 (0:00:00.084) 0:00:17.883 ****** 11000 1726867156.24035: entering _queue_task() for managed_node1/stat 11000 1726867156.24691: worker is 1 (out of 1 available) 11000 1726867156.24702: exiting _queue_task() for managed_node1/stat 11000 1726867156.24711: done queuing things up, now waiting for results queue to drain 11000 1726867156.24712: waiting for pending results... 11000 1726867156.24956: running TaskExecutor() for managed_node1/TASK: Get stat for interface deprecated-bond 11000 1726867156.24961: in run() - task 0affcac9-a3a5-c734-026a-000000000242 11000 1726867156.24981: variable 'ansible_search_path' from source: unknown 11000 1726867156.24988: variable 'ansible_search_path' from source: unknown 11000 1726867156.25025: calling self._execute() 11000 1726867156.25130: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867156.25142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867156.25169: variable 'omit' from source: magic vars 11000 1726867156.25546: variable 'ansible_distribution_major_version' from source: facts 11000 1726867156.25564: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867156.25574: variable 'omit' from source: magic vars 11000 1726867156.25653: variable 'omit' from source: magic vars 11000 1726867156.25774: variable 'interface' from source: task vars 11000 1726867156.25787: variable 'controller_device' from source: play vars 11000 1726867156.25926: variable 'controller_device' from source: play vars 11000 1726867156.25929: variable 'omit' from source: magic vars 11000 1726867156.25941: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867156.25985: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867156.26009: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867156.26043: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867156.26058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11000 1726867156.26095: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867156.26103: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867156.26111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867156.26220: Set connection var ansible_shell_type to sh 11000 1726867156.26234: Set connection var ansible_pipelining to False 11000 1726867156.26356: Set connection var ansible_shell_executable to /bin/sh 11000 1726867156.26360: Set connection var ansible_connection to ssh 11000 1726867156.26362: Set connection var ansible_timeout to 10 11000 1726867156.26365: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867156.26367: variable 'ansible_shell_executable' from source: unknown 11000 1726867156.26369: variable 'ansible_connection' from source: unknown 11000 1726867156.26371: variable 'ansible_module_compression' from source: unknown 11000 1726867156.26373: variable 'ansible_shell_type' from source: unknown 11000 1726867156.26375: variable 'ansible_shell_executable' from source: unknown 11000 1726867156.26379: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867156.26381: variable 'ansible_pipelining' from source: unknown 11000 1726867156.26383: variable 'ansible_timeout' from source: unknown 11000 1726867156.26385: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867156.26614: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11000 1726867156.26619: variable 'omit' from source: magic vars 11000 1726867156.26622: starting attempt loop 11000 1726867156.26624: running the handler 11000 1726867156.26683: _low_level_execute_command(): starting 11000 1726867156.26686: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867156.27461: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867156.27528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867156.27560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867156.27635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867156.29391: stdout chunk (state=3): >>>/root <<< 11000 1726867156.29716: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 11000 1726867156.29720: stdout chunk (state=3): >>><<< 11000 1726867156.29724: stderr chunk (state=3): >>><<< 11000 1726867156.29726: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867156.29729: _low_level_execute_command(): starting 11000 1726867156.29732: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867156.296219-11833-912127174891 `" && echo ansible-tmp-1726867156.296219-11833-912127174891="` echo /root/.ansible/tmp/ansible-tmp-1726867156.296219-11833-912127174891 `" ) && sleep 0' 11000 1726867156.30705: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867156.30981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867156.31002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867156.31076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867156.32960: stdout chunk (state=3): >>>ansible-tmp-1726867156.296219-11833-912127174891=/root/.ansible/tmp/ansible-tmp-1726867156.296219-11833-912127174891 <<< 11000 1726867156.33288: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867156.33292: stdout chunk (state=3): >>><<< 11000 1726867156.33294: stderr chunk 
(state=3): >>><<< 11000 1726867156.33296: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867156.296219-11833-912127174891=/root/.ansible/tmp/ansible-tmp-1726867156.296219-11833-912127174891 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867156.33298: variable 'ansible_module_compression' from source: unknown 11000 1726867156.33300: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11000 1726867156.33302: variable 'ansible_facts' from source: unknown 11000 1726867156.33384: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867156.296219-11833-912127174891/AnsiballZ_stat.py 11000 1726867156.33605: Sending initial data 11000 1726867156.33608: Sent initial data (149 bytes) 11000 1726867156.34648: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867156.34767: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867156.34893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867156.36414: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11000 1726867156.36426: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11000 1726867156.36436: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11000 
1726867156.36445: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 11000 1726867156.36454: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 11000 1726867156.36461: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 11000 1726867156.36472: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 11000 1726867156.36495: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867156.36550: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11000 1726867156.36625: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmplyns653m /root/.ansible/tmp/ansible-tmp-1726867156.296219-11833-912127174891/AnsiballZ_stat.py <<< 11000 1726867156.36634: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867156.296219-11833-912127174891/AnsiballZ_stat.py" <<< 11000 1726867156.36660: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmplyns653m" to remote "/root/.ansible/tmp/ansible-tmp-1726867156.296219-11833-912127174891/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867156.296219-11833-912127174891/AnsiballZ_stat.py" <<< 11000 1726867156.37417: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867156.37451: stderr chunk (state=3): >>><<< 11000 1726867156.37460: stdout chunk (state=3): >>><<< 11000 1726867156.37602: done transferring module to remote 11000 1726867156.37609: _low_level_execute_command(): starting 11000 1726867156.37612: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867156.296219-11833-912127174891/ /root/.ansible/tmp/ansible-tmp-1726867156.296219-11833-912127174891/AnsiballZ_stat.py && sleep 0' 11000 1726867156.38095: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867156.38099: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867156.38102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867156.38105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867156.38107: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867156.38109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867156.38118: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867156.38154: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867156.38167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867156.38222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867156.39967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867156.39991: stderr chunk (state=3): >>><<< 11000 1726867156.39995: stdout chunk (state=3): >>><<< 11000 1726867156.40005: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867156.40008: _low_level_execute_command(): starting 11000 1726867156.40013: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867156.296219-11833-912127174891/AnsiballZ_stat.py && sleep 0' 11000 1726867156.40421: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867156.40424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867156.40426: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867156.40428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867156.40433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867156.40467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867156.40473: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867156.40547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867156.55628: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/deprecated-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27280, "dev": 23, "nlink": 1, "atime": 1726867154.899742, "mtime": 1726867154.899742, "ctime": 1726867154.899742, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/deprecated-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11000 1726867156.56898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867156.56924: stderr chunk (state=3): >>><<< 11000 1726867156.56928: stdout chunk (state=3): >>><<< 11000 1726867156.56946: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/deprecated-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27280, "dev": 23, "nlink": 1, "atime": 1726867154.899742, "mtime": 1726867154.899742, "ctime": 1726867154.899742, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/deprecated-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 11000 1726867156.56991: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867156.296219-11833-912127174891/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867156.56995: _low_level_execute_command(): starting 11000 1726867156.57001: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867156.296219-11833-912127174891/ > /dev/null 2>&1 && sleep 0' 11000 1726867156.57457: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867156.57460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867156.57467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867156.57469: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867156.57471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867156.57473: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867156.57525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867156.57531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867156.57573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867156.59372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867156.59399: stderr chunk (state=3): >>><<< 11000 1726867156.59402: stdout chunk (state=3): >>><<< 11000 1726867156.59414: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867156.59423: handler run complete 11000 1726867156.59454: attempt loop complete, returning result 11000 1726867156.59457: _execute() done 11000 1726867156.59460: dumping result to json 11000 1726867156.59465: done dumping result, returning 11000 1726867156.59472: done running TaskExecutor() for managed_node1/TASK: Get stat for interface deprecated-bond [0affcac9-a3a5-c734-026a-000000000242] 11000 1726867156.59478: sending task result for task 0affcac9-a3a5-c734-026a-000000000242 11000 1726867156.59578: done sending task result for task 0affcac9-a3a5-c734-026a-000000000242 11000 1726867156.59582: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726867154.899742, "block_size": 4096, "blocks": 0, "ctime": 1726867154.899742, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27280, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "mode": "0777", "mtime": 1726867154.899742, "nlink": 1, "path": "/sys/class/net/deprecated-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11000 1726867156.59664: no more pending results, returning what we have 11000 1726867156.59667: results queue empty 11000 1726867156.59668: checking for any_errors_fatal 11000 1726867156.59669: done checking for any_errors_fatal 11000 1726867156.59669: checking for max_fail_percentage 11000 1726867156.59671: done checking for max_fail_percentage 11000 1726867156.59672: checking to see if all hosts have failed and the running result is not ok 11000 1726867156.59672: done checking to see if all hosts have failed 11000 1726867156.59673: getting the remaining hosts for this loop 11000 1726867156.59674: done getting the remaining hosts for this loop 11000 1726867156.59680: getting the next task for host managed_node1 11000 1726867156.59688: done getting next task for host managed_node1 11000 1726867156.59692: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11000 1726867156.59694: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867156.59699: getting variables 11000 1726867156.59701: in VariableManager get_vars() 11000 1726867156.59739: Calling all_inventory to load vars for managed_node1 11000 1726867156.59741: Calling groups_inventory to load vars for managed_node1 11000 1726867156.59743: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867156.59753: Calling all_plugins_play to load vars for managed_node1 11000 1726867156.59755: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867156.59758: Calling groups_plugins_play to load vars for managed_node1 11000 1726867156.60585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867156.61468: done with get_vars() 11000 1726867156.61485: done getting variables 11000 1726867156.61530: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867156.61614: variable 'interface' from source: task vars 11000 1726867156.61618: variable 'controller_device' from source: play vars 11000 1726867156.61661: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'deprecated-bond'] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 17:19:16 -0400 (0:00:00.376) 0:00:18.259 ****** 11000 1726867156.61686: entering _queue_task() for managed_node1/assert 11000 1726867156.61905: worker is 1 (out of 1 available) 11000 1726867156.61920: exiting _queue_task() for managed_node1/assert 11000 1726867156.61933: done queuing things up, now waiting for results queue to drain 11000 1726867156.61934: waiting for pending results... 
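(For reference, the two tasks exercised in this stretch of the log can be reconstructed approximately from the logged module arguments and conditionals. The following is a sketch of what tasks/get_interface_stat.yml and tasks/assert_device_present.yml appear to do, inferred from the log rather than taken from the playbook sources; the register name interface_stat is implied by the conditional evaluated further down, not shown verbatim here.)

# Sketch of get_interface_stat.yml, inferred from the stat module_args logged above
- name: Get stat for interface {{ interface }}
  stat:
    path: "/sys/class/net/{{ interface }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat

# Sketch of assert_device_present.yml, inferred from the task names and the
# conditional (interface_stat.stat.exists) evaluated in the assert below
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists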
11000 1726867156.62299: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'deprecated-bond' 11000 1726867156.62304: in run() - task 0affcac9-a3a5-c734-026a-00000000006f 11000 1726867156.62307: variable 'ansible_search_path' from source: unknown 11000 1726867156.62311: variable 'ansible_search_path' from source: unknown 11000 1726867156.62396: calling self._execute() 11000 1726867156.62649: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867156.62654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867156.62664: variable 'omit' from source: magic vars 11000 1726867156.63183: variable 'ansible_distribution_major_version' from source: facts 11000 1726867156.63199: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867156.63205: variable 'omit' from source: magic vars 11000 1726867156.63259: variable 'omit' from source: magic vars 11000 1726867156.63363: variable 'interface' from source: task vars 11000 1726867156.63367: variable 'controller_device' from source: play vars 11000 1726867156.63472: variable 'controller_device' from source: play vars 11000 1726867156.63496: variable 'omit' from source: magic vars 11000 1726867156.63532: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867156.63579: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867156.63606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867156.63642: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867156.63645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867156.63668: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867156.63672: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867156.63674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867156.63745: Set connection var ansible_shell_type to sh 11000 1726867156.63751: Set connection var ansible_pipelining to False 11000 1726867156.63758: Set connection var ansible_shell_executable to /bin/sh 11000 1726867156.63761: Set connection var ansible_connection to ssh 11000 1726867156.63766: Set connection var ansible_timeout to 10 11000 1726867156.63781: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867156.63823: variable 'ansible_shell_executable' from source: unknown 11000 1726867156.63826: variable 'ansible_connection' from source: unknown 11000 1726867156.63829: variable 'ansible_module_compression' from source: unknown 11000 1726867156.63831: variable 'ansible_shell_type' from source: unknown 11000 1726867156.63833: variable 'ansible_shell_executable' from source: unknown 11000 1726867156.63835: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867156.63840: variable 'ansible_pipelining' from source: unknown 11000 1726867156.63842: variable 'ansible_timeout' from source: unknown 11000 1726867156.63845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867156.63947: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867156.63956: variable 'omit' from source: magic vars 11000 1726867156.63962: starting attempt loop 11000 1726867156.63965: running the handler 11000 1726867156.64056: variable 'interface_stat' from source: set_fact 11000 1726867156.64076: Evaluated conditional (interface_stat.stat.exists): True 11000 1726867156.64081: handler run complete 11000 1726867156.64096: attempt loop complete, returning result 11000 1726867156.64099: _execute() done 11000 1726867156.64101: dumping result to json 11000 1726867156.64104: done dumping result, returning 11000 1726867156.64110: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'deprecated-bond' [0affcac9-a3a5-c734-026a-00000000006f] 11000 1726867156.64112: sending task result for task 0affcac9-a3a5-c734-026a-00000000006f 11000 1726867156.64189: done sending task result for task 0affcac9-a3a5-c734-026a-00000000006f 11000 1726867156.64192: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11000 1726867156.64262: no more pending results, returning what we have 11000 1726867156.64265: results queue empty 11000 1726867156.64266: checking for any_errors_fatal 11000 1726867156.64272: done checking for any_errors_fatal 11000 1726867156.64273: checking for max_fail_percentage 11000 1726867156.64274: done checking for max_fail_percentage 11000 1726867156.64275: checking to see if all hosts have failed and the running result is not ok 11000 1726867156.64276: done checking to see if all hosts have failed 11000 1726867156.64276: getting the remaining hosts for this loop 11000 1726867156.64279: done getting the remaining hosts for this loop 11000 1726867156.64282: getting the next task for host managed_node1 11000 1726867156.64288: done getting next task for host managed_node1 11000 1726867156.64291: ^ task is: TASK: Include the task 'assert_profile_present.yml' 11000 1726867156.64292: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867156.64295: getting variables 11000 1726867156.64297: in VariableManager get_vars() 11000 1726867156.64330: Calling all_inventory to load vars for managed_node1 11000 1726867156.64332: Calling groups_inventory to load vars for managed_node1 11000 1726867156.64334: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867156.64343: Calling all_plugins_play to load vars for managed_node1 11000 1726867156.64345: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867156.64347: Calling groups_plugins_play to load vars for managed_node1 11000 1726867156.65130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867156.66602: done with get_vars() 11000 1726867156.66621: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:67 Friday 20 September 2024 17:19:16 -0400 (0:00:00.050) 0:00:18.310 ****** 11000 1726867156.66701: entering _queue_task() for managed_node1/include_tasks 11000 1726867156.66985: worker is 1 (out of 1 available) 11000 1726867156.66996: exiting _queue_task() for managed_node1/include_tasks 11000 1726867156.67009: done queuing things up, now waiting for results queue to drain 11000 1726867156.67010: waiting for pending results... 11000 1726867156.67282: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' 11000 1726867156.67342: in run() - task 0affcac9-a3a5-c734-026a-000000000070 11000 1726867156.67353: variable 'ansible_search_path' from source: unknown 11000 1726867156.67394: variable 'controller_profile' from source: play vars 11000 1726867156.67576: variable 'controller_profile' from source: play vars 11000 1726867156.67621: variable 'port1_profile' from source: play vars 11000 1726867156.67645: variable 'port1_profile' from source: play vars 11000 1726867156.67651: variable 'port2_profile' from source: play vars 11000 1726867156.67700: variable 'port2_profile' from source: play vars 11000 1726867156.67710: variable 'omit' from source: magic vars 11000 1726867156.67812: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867156.67819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867156.67830: variable 'omit' from source: magic vars 11000 1726867156.67993: variable 'ansible_distribution_major_version' from source: facts 11000 1726867156.68001: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867156.68021: variable 'item' from source: unknown 11000 1726867156.68068: variable 'item' from source: unknown 11000 1726867156.68184: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867156.68190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867156.68193: variable 'omit' from source: magic vars 11000 1726867156.68268: variable 'ansible_distribution_major_version' from source: facts 11000 1726867156.68272: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867156.68294: variable 'item' from source: unknown 11000 1726867156.68338: variable 'item' from source: unknown 11000 1726867156.68403: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867156.68406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 
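(The include queued here fans out over the bond controller and port profiles. Based on the per-item includes logged just below, item=bond0, item=bond0.0 and item=bond0.1, and the play variables named in this entry, the task at tests_bond_deprecated.yml:67 plausibly looks like the sketch that follows; the relative include path and the way each item is consumed inside assert_profile_present.yml are assumptions, since neither appears verbatim in this part of the log.)

# Sketch of the looped include at tests_bond_deprecated.yml:67
- name: Include the task 'assert_profile_present.yml'
  include_tasks: tasks/assert_profile_present.yml
  loop:
    - "{{ controller_profile }}"  # resolves to bond0 in this run
    - "{{ port1_profile }}"       # resolves to bond0.0
    - "{{ port2_profile }}"       # resolves to bond0.1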
11000 1726867156.68418: variable 'omit' from source: magic vars 11000 1726867156.68513: variable 'ansible_distribution_major_version' from source: facts 11000 1726867156.68517: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867156.68538: variable 'item' from source: unknown 11000 1726867156.68581: variable 'item' from source: unknown 11000 1726867156.68647: dumping result to json 11000 1726867156.68649: done dumping result, returning 11000 1726867156.68652: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' [0affcac9-a3a5-c734-026a-000000000070] 11000 1726867156.68654: sending task result for task 0affcac9-a3a5-c734-026a-000000000070 11000 1726867156.68690: done sending task result for task 0affcac9-a3a5-c734-026a-000000000070 11000 1726867156.68692: WORKER PROCESS EXITING 11000 1726867156.68718: no more pending results, returning what we have 11000 1726867156.68722: in VariableManager get_vars() 11000 1726867156.68764: Calling all_inventory to load vars for managed_node1 11000 1726867156.68767: Calling groups_inventory to load vars for managed_node1 11000 1726867156.68769: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867156.68784: Calling all_plugins_play to load vars for managed_node1 11000 1726867156.68789: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867156.68792: Calling groups_plugins_play to load vars for managed_node1 11000 1726867156.70542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867156.72917: done with get_vars() 11000 1726867156.72938: variable 'ansible_search_path' from source: unknown 11000 1726867156.72954: variable 'ansible_search_path' from source: unknown 11000 1726867156.72963: variable 'ansible_search_path' from source: unknown 11000 1726867156.72969: we have included files to process 11000 1726867156.72970: generating all_blocks data 11000 1726867156.72972: done generating all_blocks data 11000 1726867156.72975: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11000 1726867156.72978: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11000 1726867156.72981: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11000 1726867156.73168: in VariableManager get_vars() 11000 1726867156.73194: done with get_vars() 11000 1726867156.73448: done processing included file 11000 1726867156.73450: iterating over new_blocks loaded from include file 11000 1726867156.73451: in VariableManager get_vars() 11000 1726867156.73468: done with get_vars() 11000 1726867156.73470: filtering new block on tags 11000 1726867156.73493: done filtering new block on tags 11000 1726867156.73496: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=bond0) 11000 1726867156.73501: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11000 1726867156.73502: loading included file: 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11000 1726867156.73505: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11000 1726867156.73603: in VariableManager get_vars() 11000 1726867156.73623: done with get_vars() 11000 1726867156.73832: done processing included file 11000 1726867156.73834: iterating over new_blocks loaded from include file 11000 1726867156.73835: in VariableManager get_vars() 11000 1726867156.73852: done with get_vars() 11000 1726867156.73854: filtering new block on tags 11000 1726867156.73871: done filtering new block on tags 11000 1726867156.73873: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=bond0.0) 11000 1726867156.73876: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11000 1726867156.73879: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11000 1726867156.73882: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11000 1726867156.74322: in VariableManager get_vars() 11000 1726867156.74390: done with get_vars() 11000 1726867156.74700: done processing included file 11000 1726867156.74702: iterating over new_blocks loaded from include file 11000 1726867156.74703: in VariableManager get_vars() 11000 1726867156.74720: done with get_vars() 11000 1726867156.74722: filtering new block on tags 11000 1726867156.74737: done filtering new block on tags 11000 1726867156.74739: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=bond0.1) 11000 1726867156.74742: extending task lists for all hosts with included blocks 11000 1726867156.78300: done extending task lists 11000 1726867156.78306: done processing included files 11000 1726867156.78307: results queue empty 11000 1726867156.78308: checking for any_errors_fatal 11000 1726867156.78312: done checking for any_errors_fatal 11000 1726867156.78313: checking for max_fail_percentage 11000 1726867156.78314: done checking for max_fail_percentage 11000 1726867156.78314: checking to see if all hosts have failed and the running result is not ok 11000 1726867156.78315: done checking to see if all hosts have failed 11000 1726867156.78316: getting the remaining hosts for this loop 11000 1726867156.78317: done getting the remaining hosts for this loop 11000 1726867156.78319: getting the next task for host managed_node1 11000 1726867156.78324: done getting next task for host managed_node1 11000 1726867156.78326: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11000 1726867156.78328: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867156.78331: getting variables 11000 1726867156.78332: in VariableManager get_vars() 11000 1726867156.78345: Calling all_inventory to load vars for managed_node1 11000 1726867156.78347: Calling groups_inventory to load vars for managed_node1 11000 1726867156.78349: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867156.78354: Calling all_plugins_play to load vars for managed_node1 11000 1726867156.78356: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867156.78359: Calling groups_plugins_play to load vars for managed_node1 11000 1726867156.85539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867156.87863: done with get_vars() 11000 1726867156.88188: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 17:19:16 -0400 (0:00:00.215) 0:00:18.525 ****** 11000 1726867156.88258: entering _queue_task() for managed_node1/include_tasks 11000 1726867156.88819: worker is 1 (out of 1 available) 11000 1726867156.88830: exiting _queue_task() for managed_node1/include_tasks 11000 1726867156.88841: done queuing things up, now waiting for results queue to drain 11000 1726867156.88842: waiting for pending results... 11000 1726867156.89326: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 11000 1726867156.89422: in run() - task 0affcac9-a3a5-c734-026a-000000000260 11000 1726867156.89441: variable 'ansible_search_path' from source: unknown 11000 1726867156.89450: variable 'ansible_search_path' from source: unknown 11000 1726867156.89516: calling self._execute() 11000 1726867156.89939: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867156.89942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867156.89945: variable 'omit' from source: magic vars 11000 1726867156.90562: variable 'ansible_distribution_major_version' from source: facts 11000 1726867156.90627: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867156.90694: _execute() done 11000 1726867156.90782: dumping result to json 11000 1726867156.90786: done dumping result, returning 11000 1726867156.90791: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-c734-026a-000000000260] 11000 1726867156.90793: sending task result for task 0affcac9-a3a5-c734-026a-000000000260 11000 1726867156.90869: done sending task result for task 0affcac9-a3a5-c734-026a-000000000260 11000 1726867156.90872: WORKER PROCESS EXITING 11000 1726867156.90905: no more pending results, returning what we have 11000 1726867156.90910: in VariableManager get_vars() 11000 1726867156.90958: Calling all_inventory to load vars for managed_node1 11000 1726867156.90960: Calling groups_inventory to load vars for managed_node1 11000 1726867156.90963: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867156.90976: Calling all_plugins_play to load vars for managed_node1 11000 1726867156.90980: Calling groups_plugins_inventory to load vars for 
managed_node1 11000 1726867156.90983: Calling groups_plugins_play to load vars for managed_node1 11000 1726867156.92300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867156.94193: done with get_vars() 11000 1726867156.94210: variable 'ansible_search_path' from source: unknown 11000 1726867156.94212: variable 'ansible_search_path' from source: unknown 11000 1726867156.94246: we have included files to process 11000 1726867156.94248: generating all_blocks data 11000 1726867156.94250: done generating all_blocks data 11000 1726867156.94251: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11000 1726867156.94252: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11000 1726867156.94255: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11000 1726867156.95667: done processing included file 11000 1726867156.95669: iterating over new_blocks loaded from include file 11000 1726867156.95670: in VariableManager get_vars() 11000 1726867156.95693: done with get_vars() 11000 1726867156.95695: filtering new block on tags 11000 1726867156.95716: done filtering new block on tags 11000 1726867156.95719: in VariableManager get_vars() 11000 1726867156.95744: done with get_vars() 11000 1726867156.95747: filtering new block on tags 11000 1726867156.95773: done filtering new block on tags 11000 1726867156.95775: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 11000 1726867156.95782: extending task lists for all hosts with included blocks 11000 1726867156.96040: done extending task lists 11000 1726867156.96041: done processing included files 11000 1726867156.96042: results queue empty 11000 1726867156.96043: checking for any_errors_fatal 11000 1726867156.96046: done checking for any_errors_fatal 11000 1726867156.96047: checking for max_fail_percentage 11000 1726867156.96048: done checking for max_fail_percentage 11000 1726867156.96049: checking to see if all hosts have failed and the running result is not ok 11000 1726867156.96050: done checking to see if all hosts have failed 11000 1726867156.96051: getting the remaining hosts for this loop 11000 1726867156.96052: done getting the remaining hosts for this loop 11000 1726867156.96054: getting the next task for host managed_node1 11000 1726867156.96058: done getting next task for host managed_node1 11000 1726867156.96061: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11000 1726867156.96063: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867156.96066: getting variables 11000 1726867156.96067: in VariableManager get_vars() 11000 1726867156.96082: Calling all_inventory to load vars for managed_node1 11000 1726867156.96084: Calling groups_inventory to load vars for managed_node1 11000 1726867156.96087: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867156.96092: Calling all_plugins_play to load vars for managed_node1 11000 1726867156.96094: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867156.96097: Calling groups_plugins_play to load vars for managed_node1 11000 1726867156.97354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867156.98341: done with get_vars() 11000 1726867156.98354: done getting variables 11000 1726867156.98381: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:19:16 -0400 (0:00:00.101) 0:00:18.627 ****** 11000 1726867156.98406: entering _queue_task() for managed_node1/set_fact 11000 1726867156.98634: worker is 1 (out of 1 available) 11000 1726867156.98644: exiting _queue_task() for managed_node1/set_fact 11000 1726867156.98655: done queuing things up, now waiting for results queue to drain 11000 1726867156.98656: waiting for pending results... 
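
For reference, a minimal sketch of what the "Initialize NM profile exist and ansible_managed comment flag" task at get_profile_stat.yml:3 likely looks like, reconstructed only from the fact names and values reported in the task result further down in this trace; the exact YAML shipped in the fedora.linux_system_roles collection may differ:

- name: Initialize NM profile exist and ansible_managed comment flag
  ansible.builtin.set_fact:
    # the trace reports exactly these three facts, all initialized to false
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
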
11000 1726867156.98841: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 11000 1726867156.99093: in run() - task 0affcac9-a3a5-c734-026a-0000000003b3 11000 1726867156.99097: variable 'ansible_search_path' from source: unknown 11000 1726867156.99100: variable 'ansible_search_path' from source: unknown 11000 1726867156.99102: calling self._execute() 11000 1726867156.99116: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867156.99122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867156.99132: variable 'omit' from source: magic vars 11000 1726867156.99492: variable 'ansible_distribution_major_version' from source: facts 11000 1726867156.99502: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867156.99508: variable 'omit' from source: magic vars 11000 1726867156.99550: variable 'omit' from source: magic vars 11000 1726867156.99585: variable 'omit' from source: magic vars 11000 1726867156.99620: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867156.99657: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867156.99678: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867156.99697: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867156.99709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867156.99743: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867156.99746: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867156.99749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867156.99839: Set connection var ansible_shell_type to sh 11000 1726867156.99846: Set connection var ansible_pipelining to False 11000 1726867156.99855: Set connection var ansible_shell_executable to /bin/sh 11000 1726867156.99858: Set connection var ansible_connection to ssh 11000 1726867156.99861: Set connection var ansible_timeout to 10 11000 1726867156.99868: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867156.99895: variable 'ansible_shell_executable' from source: unknown 11000 1726867156.99899: variable 'ansible_connection' from source: unknown 11000 1726867156.99902: variable 'ansible_module_compression' from source: unknown 11000 1726867156.99904: variable 'ansible_shell_type' from source: unknown 11000 1726867156.99907: variable 'ansible_shell_executable' from source: unknown 11000 1726867156.99909: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867156.99912: variable 'ansible_pipelining' from source: unknown 11000 1726867156.99915: variable 'ansible_timeout' from source: unknown 11000 1726867156.99919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.00052: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867157.00070: variable 
'omit' from source: magic vars 11000 1726867157.00078: starting attempt loop 11000 1726867157.00082: running the handler 11000 1726867157.00085: handler run complete 11000 1726867157.00180: attempt loop complete, returning result 11000 1726867157.00184: _execute() done 11000 1726867157.00185: dumping result to json 11000 1726867157.00189: done dumping result, returning 11000 1726867157.00191: done running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-c734-026a-0000000003b3] 11000 1726867157.00193: sending task result for task 0affcac9-a3a5-c734-026a-0000000003b3 11000 1726867157.00243: done sending task result for task 0affcac9-a3a5-c734-026a-0000000003b3 11000 1726867157.00246: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11000 1726867157.00297: no more pending results, returning what we have 11000 1726867157.00299: results queue empty 11000 1726867157.00300: checking for any_errors_fatal 11000 1726867157.00301: done checking for any_errors_fatal 11000 1726867157.00302: checking for max_fail_percentage 11000 1726867157.00303: done checking for max_fail_percentage 11000 1726867157.00304: checking to see if all hosts have failed and the running result is not ok 11000 1726867157.00305: done checking to see if all hosts have failed 11000 1726867157.00305: getting the remaining hosts for this loop 11000 1726867157.00307: done getting the remaining hosts for this loop 11000 1726867157.00309: getting the next task for host managed_node1 11000 1726867157.00315: done getting next task for host managed_node1 11000 1726867157.00317: ^ task is: TASK: Stat profile file 11000 1726867157.00320: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867157.00322: getting variables 11000 1726867157.00323: in VariableManager get_vars() 11000 1726867157.00353: Calling all_inventory to load vars for managed_node1 11000 1726867157.00355: Calling groups_inventory to load vars for managed_node1 11000 1726867157.00357: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867157.00365: Calling all_plugins_play to load vars for managed_node1 11000 1726867157.00367: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867157.00369: Calling groups_plugins_play to load vars for managed_node1 11000 1726867157.01808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867157.03546: done with get_vars() 11000 1726867157.03572: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:19:17 -0400 (0:00:00.052) 0:00:18.679 ****** 11000 1726867157.03659: entering _queue_task() for managed_node1/stat 11000 1726867157.03968: worker is 1 (out of 1 available) 11000 1726867157.03981: exiting _queue_task() for managed_node1/stat 11000 1726867157.03996: done queuing things up, now waiting for results queue to drain 11000 1726867157.03997: waiting for pending results... 11000 1726867157.04403: running TaskExecutor() for managed_node1/TASK: Stat profile file 11000 1726867157.04407: in run() - task 0affcac9-a3a5-c734-026a-0000000003b4 11000 1726867157.04410: variable 'ansible_search_path' from source: unknown 11000 1726867157.04412: variable 'ansible_search_path' from source: unknown 11000 1726867157.04415: calling self._execute() 11000 1726867157.04437: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867157.04444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.04453: variable 'omit' from source: magic vars 11000 1726867157.04803: variable 'ansible_distribution_major_version' from source: facts 11000 1726867157.04815: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867157.04828: variable 'omit' from source: magic vars 11000 1726867157.04864: variable 'omit' from source: magic vars 11000 1726867157.04955: variable 'profile' from source: include params 11000 1726867157.04958: variable 'item' from source: include params 11000 1726867157.05023: variable 'item' from source: include params 11000 1726867157.05044: variable 'omit' from source: magic vars 11000 1726867157.05082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867157.05118: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867157.05138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867157.05155: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867157.05168: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867157.05199: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867157.05202: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867157.05205: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.05296: Set connection var ansible_shell_type to sh 11000 1726867157.05304: Set connection var ansible_pipelining to False 11000 1726867157.05313: Set connection var ansible_shell_executable to /bin/sh 11000 1726867157.05316: Set connection var ansible_connection to ssh 11000 1726867157.05321: Set connection var ansible_timeout to 10 11000 1726867157.05327: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867157.05353: variable 'ansible_shell_executable' from source: unknown 11000 1726867157.05356: variable 'ansible_connection' from source: unknown 11000 1726867157.05359: variable 'ansible_module_compression' from source: unknown 11000 1726867157.05479: variable 'ansible_shell_type' from source: unknown 11000 1726867157.05483: variable 'ansible_shell_executable' from source: unknown 11000 1726867157.05486: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867157.05491: variable 'ansible_pipelining' from source: unknown 11000 1726867157.05494: variable 'ansible_timeout' from source: unknown 11000 1726867157.05496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.05585: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11000 1726867157.05592: variable 'omit' from source: magic vars 11000 1726867157.05595: starting attempt loop 11000 1726867157.05598: running the handler 11000 1726867157.05600: _low_level_execute_command(): starting 11000 1726867157.05602: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867157.06349: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867157.06352: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867157.06355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867157.06358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867157.06361: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867157.06363: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867157.06365: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.06376: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11000 1726867157.06380: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867157.06383: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11000 1726867157.06385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867157.06387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867157.06400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867157.06407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867157.06415: stderr chunk (state=3): >>>debug2: match found <<< 11000 1726867157.06425: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.06505: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867157.06538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867157.06598: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867157.08276: stdout chunk (state=3): >>>/root <<< 11000 1726867157.08396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867157.08450: stderr chunk (state=3): >>><<< 11000 1726867157.08453: stdout chunk (state=3): >>><<< 11000 1726867157.08563: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867157.08566: _low_level_execute_command(): starting 11000 1726867157.08570: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867157.0847826-11877-127254996227142 `" && echo ansible-tmp-1726867157.0847826-11877-127254996227142="` echo /root/.ansible/tmp/ansible-tmp-1726867157.0847826-11877-127254996227142 `" ) && sleep 0' 11000 1726867157.09114: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867157.09131: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867157.09153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867157.09170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867157.09191: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867157.09258: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.09307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867157.09332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867157.09411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867157.11313: stdout chunk (state=3): >>>ansible-tmp-1726867157.0847826-11877-127254996227142=/root/.ansible/tmp/ansible-tmp-1726867157.0847826-11877-127254996227142 <<< 11000 1726867157.11683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867157.11686: stdout chunk (state=3): >>><<< 11000 1726867157.11689: stderr chunk (state=3): >>><<< 11000 1726867157.11691: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867157.0847826-11877-127254996227142=/root/.ansible/tmp/ansible-tmp-1726867157.0847826-11877-127254996227142 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867157.11693: variable 'ansible_module_compression' from source: unknown 11000 1726867157.11749: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11000 1726867157.11852: variable 'ansible_facts' from source: unknown 11000 1726867157.11955: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867157.0847826-11877-127254996227142/AnsiballZ_stat.py 11000 1726867157.12314: Sending initial data 11000 1726867157.12324: Sent initial data (153 bytes) 11000 1726867157.13289: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867157.13303: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 11000 1726867157.13314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.13401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867157.13809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867157.13832: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867157.15400: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11000 1726867157.15405: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867157.15494: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11000 1726867157.15538: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmp8mu0fmrt /root/.ansible/tmp/ansible-tmp-1726867157.0847826-11877-127254996227142/AnsiballZ_stat.py <<< 11000 1726867157.15620: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867157.0847826-11877-127254996227142/AnsiballZ_stat.py" <<< 11000 1726867157.15653: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmp8mu0fmrt" to remote "/root/.ansible/tmp/ansible-tmp-1726867157.0847826-11877-127254996227142/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867157.0847826-11877-127254996227142/AnsiballZ_stat.py" <<< 11000 1726867157.16515: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867157.16518: stdout chunk (state=3): >>><<< 11000 1726867157.16622: stderr chunk (state=3): >>><<< 11000 1726867157.16626: done transferring module to remote 11000 1726867157.16628: _low_level_execute_command(): starting 11000 1726867157.16630: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867157.0847826-11877-127254996227142/ /root/.ansible/tmp/ansible-tmp-1726867157.0847826-11877-127254996227142/AnsiballZ_stat.py && sleep 0' 11000 1726867157.17057: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867157.17061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.17069: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867157.17071: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867157.17073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.17119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867157.17122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867157.17171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867157.19042: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867157.19046: stdout chunk (state=3): >>><<< 11000 1726867157.19176: stderr chunk (state=3): >>><<< 11000 1726867157.19182: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867157.19185: _low_level_execute_command(): starting 11000 1726867157.19187: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867157.0847826-11877-127254996227142/AnsiballZ_stat.py && sleep 0' 11000 1726867157.20286: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867157.20294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867157.20297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867157.20299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867157.20302: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867157.20304: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867157.20306: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.20309: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11000 1726867157.20311: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867157.20315: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11000 1726867157.20317: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867157.20545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.20548: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867157.20551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867157.20673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867157.35725: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11000 1726867157.37047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867157.37074: stderr chunk (state=3): >>><<< 11000 1726867157.37085: stdout chunk (state=3): >>><<< 11000 1726867157.37097: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
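
For reference, a minimal sketch of the "Stat profile file" task at get_profile_stat.yml:9 that produced the module invocation above, reconstructed from the module_args echoed in the stdout (path, get_attributes, get_checksum, get_mime). The register name profile_stat is inferred from the later "profile_stat.stat.exists" conditional, and the "{{ profile }}" templating is an assumption based on the 'profile'/'item' include params seen in the variable trace; the collection's actual task may differ:

- name: Stat profile file
  ansible.builtin.stat:
    # assumed templating; the log shows the resolved path
    # /etc/sysconfig/network-scripts/ifcfg-bond0 for item=bond0
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat  # referenced later as profile_stat.stat.exists
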
11000 1726867157.37119: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867157.0847826-11877-127254996227142/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867157.37127: _low_level_execute_command(): starting 11000 1726867157.37132: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867157.0847826-11877-127254996227142/ > /dev/null 2>&1 && sleep 0' 11000 1726867157.37781: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867157.37791: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867157.37826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867157.39646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867157.39672: stderr chunk (state=3): >>><<< 11000 1726867157.39675: stdout chunk (state=3): >>><<< 11000 1726867157.39690: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867157.39698: handler run complete 11000 1726867157.39713: attempt loop complete, returning result 11000 1726867157.39716: _execute() done 11000 1726867157.39719: dumping result to json 11000 1726867157.39721: done dumping result, returning 11000 1726867157.39729: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0affcac9-a3a5-c734-026a-0000000003b4] 11000 1726867157.39731: sending task result for task 0affcac9-a3a5-c734-026a-0000000003b4 11000 1726867157.39824: done sending task result for task 0affcac9-a3a5-c734-026a-0000000003b4 11000 1726867157.39827: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 11000 1726867157.39884: no more pending results, returning what we have 11000 1726867157.39887: results queue empty 11000 1726867157.39888: checking for any_errors_fatal 11000 1726867157.39892: done checking for any_errors_fatal 11000 1726867157.39893: checking for max_fail_percentage 11000 1726867157.39894: done checking for max_fail_percentage 11000 1726867157.39895: checking to see if all hosts have failed and the running result is not ok 11000 1726867157.39896: done checking to see if all hosts have failed 11000 1726867157.39897: getting the remaining hosts for this loop 11000 1726867157.39898: done getting the remaining hosts for this loop 11000 1726867157.39901: getting the next task for host managed_node1 11000 1726867157.39908: done getting next task for host managed_node1 11000 1726867157.39911: ^ task is: TASK: Set NM profile exist flag based on the profile files 11000 1726867157.39914: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867157.39918: getting variables 11000 1726867157.39919: in VariableManager get_vars() 11000 1726867157.39960: Calling all_inventory to load vars for managed_node1 11000 1726867157.39963: Calling groups_inventory to load vars for managed_node1 11000 1726867157.39965: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867157.39975: Calling all_plugins_play to load vars for managed_node1 11000 1726867157.39980: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867157.39983: Calling groups_plugins_play to load vars for managed_node1 11000 1726867157.41020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867157.42011: done with get_vars() 11000 1726867157.42025: done getting variables 11000 1726867157.42066: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:19:17 -0400 (0:00:00.384) 0:00:19.063 ****** 11000 1726867157.42090: entering _queue_task() for managed_node1/set_fact 11000 1726867157.42297: worker is 1 (out of 1 available) 11000 1726867157.42308: exiting _queue_task() for managed_node1/set_fact 11000 1726867157.42318: done queuing things up, now waiting for results queue to drain 11000 1726867157.42320: waiting for pending results... 
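
For reference, a minimal sketch of the conditional task at get_profile_stat.yml:17 that is skipped below because profile_stat.stat.exists evaluated to False. Only the task name and the when: condition are confirmed by the trace; the fact it would set and its value are assumptions:

- name: Set NM profile exist flag based on the profile files
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true  # assumed; not shown in the log because the task is skipped
  when: profile_stat.stat.exists
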
11000 1726867157.42489: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 11000 1726867157.42557: in run() - task 0affcac9-a3a5-c734-026a-0000000003b5 11000 1726867157.42568: variable 'ansible_search_path' from source: unknown 11000 1726867157.42571: variable 'ansible_search_path' from source: unknown 11000 1726867157.42603: calling self._execute() 11000 1726867157.42672: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867157.42680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.42688: variable 'omit' from source: magic vars 11000 1726867157.42956: variable 'ansible_distribution_major_version' from source: facts 11000 1726867157.42966: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867157.43052: variable 'profile_stat' from source: set_fact 11000 1726867157.43063: Evaluated conditional (profile_stat.stat.exists): False 11000 1726867157.43066: when evaluation is False, skipping this task 11000 1726867157.43068: _execute() done 11000 1726867157.43071: dumping result to json 11000 1726867157.43073: done dumping result, returning 11000 1726867157.43082: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-c734-026a-0000000003b5] 11000 1726867157.43088: sending task result for task 0affcac9-a3a5-c734-026a-0000000003b5 11000 1726867157.43167: done sending task result for task 0affcac9-a3a5-c734-026a-0000000003b5 11000 1726867157.43170: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11000 1726867157.43241: no more pending results, returning what we have 11000 1726867157.43244: results queue empty 11000 1726867157.43245: checking for any_errors_fatal 11000 1726867157.43250: done checking for any_errors_fatal 11000 1726867157.43251: checking for max_fail_percentage 11000 1726867157.43252: done checking for max_fail_percentage 11000 1726867157.43253: checking to see if all hosts have failed and the running result is not ok 11000 1726867157.43254: done checking to see if all hosts have failed 11000 1726867157.43254: getting the remaining hosts for this loop 11000 1726867157.43256: done getting the remaining hosts for this loop 11000 1726867157.43258: getting the next task for host managed_node1 11000 1726867157.43263: done getting next task for host managed_node1 11000 1726867157.43265: ^ task is: TASK: Get NM profile info 11000 1726867157.43268: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867157.43272: getting variables 11000 1726867157.43273: in VariableManager get_vars() 11000 1726867157.43307: Calling all_inventory to load vars for managed_node1 11000 1726867157.43309: Calling groups_inventory to load vars for managed_node1 11000 1726867157.43311: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867157.43320: Calling all_plugins_play to load vars for managed_node1 11000 1726867157.43322: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867157.43325: Calling groups_plugins_play to load vars for managed_node1 11000 1726867157.44140: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867157.44984: done with get_vars() 11000 1726867157.44999: done getting variables 11000 1726867157.45037: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:19:17 -0400 (0:00:00.029) 0:00:19.093 ****** 11000 1726867157.45057: entering _queue_task() for managed_node1/shell 11000 1726867157.45248: worker is 1 (out of 1 available) 11000 1726867157.45260: exiting _queue_task() for managed_node1/shell 11000 1726867157.45272: done queuing things up, now waiting for results queue to drain 11000 1726867157.45273: waiting for pending results... 11000 1726867157.45429: running TaskExecutor() for managed_node1/TASK: Get NM profile info 11000 1726867157.45507: in run() - task 0affcac9-a3a5-c734-026a-0000000003b6 11000 1726867157.45515: variable 'ansible_search_path' from source: unknown 11000 1726867157.45519: variable 'ansible_search_path' from source: unknown 11000 1726867157.45544: calling self._execute() 11000 1726867157.45618: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867157.45623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.45631: variable 'omit' from source: magic vars 11000 1726867157.45899: variable 'ansible_distribution_major_version' from source: facts 11000 1726867157.45910: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867157.45916: variable 'omit' from source: magic vars 11000 1726867157.45948: variable 'omit' from source: magic vars 11000 1726867157.46017: variable 'profile' from source: include params 11000 1726867157.46021: variable 'item' from source: include params 11000 1726867157.46070: variable 'item' from source: include params 11000 1726867157.46087: variable 'omit' from source: magic vars 11000 1726867157.46117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867157.46145: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867157.46161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867157.46174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867157.46185: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867157.46209: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867157.46212: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867157.46214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.46280: Set connection var ansible_shell_type to sh 11000 1726867157.46289: Set connection var ansible_pipelining to False 11000 1726867157.46295: Set connection var ansible_shell_executable to /bin/sh 11000 1726867157.46298: Set connection var ansible_connection to ssh 11000 1726867157.46303: Set connection var ansible_timeout to 10 11000 1726867157.46308: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867157.46328: variable 'ansible_shell_executable' from source: unknown 11000 1726867157.46331: variable 'ansible_connection' from source: unknown 11000 1726867157.46333: variable 'ansible_module_compression' from source: unknown 11000 1726867157.46336: variable 'ansible_shell_type' from source: unknown 11000 1726867157.46338: variable 'ansible_shell_executable' from source: unknown 11000 1726867157.46340: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867157.46342: variable 'ansible_pipelining' from source: unknown 11000 1726867157.46345: variable 'ansible_timeout' from source: unknown 11000 1726867157.46350: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.46448: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867157.46458: variable 'omit' from source: magic vars 11000 1726867157.46462: starting attempt loop 11000 1726867157.46465: running the handler 11000 1726867157.46476: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867157.46495: _low_level_execute_command(): starting 11000 1726867157.46503: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867157.46986: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867157.46992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.46995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867157.46998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.47038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867157.47054: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867157.47113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867157.48789: stdout chunk (state=3): >>>/root <<< 11000 1726867157.48893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867157.48920: stderr chunk (state=3): >>><<< 11000 1726867157.48923: stdout chunk (state=3): >>><<< 11000 1726867157.48940: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867157.48951: _low_level_execute_command(): starting 11000 1726867157.48955: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867157.4894028-11905-169858963805992 `" && echo ansible-tmp-1726867157.4894028-11905-169858963805992="` echo /root/.ansible/tmp/ansible-tmp-1726867157.4894028-11905-169858963805992 `" ) && sleep 0' 11000 1726867157.49345: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867157.49356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867157.49375: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867157.49382: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.49437: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867157.49440: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867157.49493: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867157.51417: stdout chunk (state=3): >>>ansible-tmp-1726867157.4894028-11905-169858963805992=/root/.ansible/tmp/ansible-tmp-1726867157.4894028-11905-169858963805992 <<< 11000 1726867157.51524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867157.51547: stderr chunk (state=3): >>><<< 11000 1726867157.51550: stdout chunk (state=3): >>><<< 11000 1726867157.51566: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867157.4894028-11905-169858963805992=/root/.ansible/tmp/ansible-tmp-1726867157.4894028-11905-169858963805992 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867157.51593: variable 'ansible_module_compression' from source: unknown 11000 1726867157.51627: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11000 1726867157.51657: variable 'ansible_facts' from source: unknown 11000 1726867157.51718: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867157.4894028-11905-169858963805992/AnsiballZ_command.py 11000 1726867157.51810: Sending initial data 11000 1726867157.51814: Sent initial data (156 bytes) 11000 1726867157.52230: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867157.52233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867157.52236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.52238: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867157.52240: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.52274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867157.52289: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867157.52340: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867157.53893: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867157.53934: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11000 1726867157.53980: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpa9lab8uj /root/.ansible/tmp/ansible-tmp-1726867157.4894028-11905-169858963805992/AnsiballZ_command.py <<< 11000 1726867157.53986: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867157.4894028-11905-169858963805992/AnsiballZ_command.py" <<< 11000 1726867157.54025: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpa9lab8uj" to remote "/root/.ansible/tmp/ansible-tmp-1726867157.4894028-11905-169858963805992/AnsiballZ_command.py" <<< 11000 1726867157.54028: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867157.4894028-11905-169858963805992/AnsiballZ_command.py" <<< 11000 1726867157.54569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867157.54607: stderr chunk (state=3): >>><<< 11000 1726867157.54611: stdout chunk (state=3): >>><<< 11000 1726867157.54648: done transferring module to remote 11000 1726867157.54655: _low_level_execute_command(): starting 11000 1726867157.54658: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867157.4894028-11905-169858963805992/ /root/.ansible/tmp/ansible-tmp-1726867157.4894028-11905-169858963805992/AnsiballZ_command.py && sleep 0' 11000 1726867157.55067: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867157.55071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867157.55073: stderr chunk (state=3): >>>debug2: checking match 
for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.55080: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867157.55083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867157.55085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.55122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867157.55125: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867157.55176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867157.56920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867157.56940: stderr chunk (state=3): >>><<< 11000 1726867157.56943: stdout chunk (state=3): >>><<< 11000 1726867157.56954: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867157.56957: _low_level_execute_command(): starting 11000 1726867157.56962: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867157.4894028-11905-169858963805992/AnsiballZ_command.py && sleep 0' 11000 1726867157.57341: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867157.57345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.57358: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.57410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867157.57413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867157.57466: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867157.75187: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 17:19:17.725518", "end": "2024-09-20 17:19:17.749944", "delta": "0:00:00.024426", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11000 1726867157.76805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 11000 1726867157.76829: stderr chunk (state=3): >>><<< 11000 1726867157.76833: stdout chunk (state=3): >>><<< 11000 1726867157.76847: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 17:19:17.725518", "end": "2024-09-20 17:19:17.749944", "delta": "0:00:00.024426", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
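For reference, the module invocation above (cmd "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc" with _uses_shell=true, reported back as ok with changed=false) corresponds to a shell task roughly like the following. This is a minimal sketch reconstructed from the log, not the role's actual source: the register name nm_profile_exists is taken from the variable the next task evaluates, and changed_when/ignore_errors are assumptions inferred from the reported result.

    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc   # profile == 'bond0' in this run
      register: nm_profile_exists    # evaluated as nm_profile_exists.rc == 0 by the next task
      changed_when: false            # assumption: the callback reports the task as ok with changed=false
      ignore_errors: true            # assumption: a profile with no match must not abort the play

Note that shell: is implemented through the command module with _uses_shell=true, which is why the log records the call as _execute_module(ansible.legacy.command, ...).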
11000 1726867157.76884: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867157.4894028-11905-169858963805992/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867157.76894: _low_level_execute_command(): starting 11000 1726867157.76899: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867157.4894028-11905-169858963805992/ > /dev/null 2>&1 && sleep 0' 11000 1726867157.77337: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867157.77341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.77343: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867157.77345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867157.77347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867157.77383: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867157.77400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867157.77448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867157.79268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867157.79294: stderr chunk (state=3): >>><<< 11000 1726867157.79297: stdout chunk (state=3): >>><<< 11000 1726867157.79309: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867157.79315: handler run complete 11000 1726867157.79331: Evaluated conditional (False): False 11000 1726867157.79339: attempt loop complete, returning result 11000 1726867157.79342: _execute() done 11000 1726867157.79345: dumping result to json 11000 1726867157.79349: done dumping result, returning 11000 1726867157.79356: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0affcac9-a3a5-c734-026a-0000000003b6] 11000 1726867157.79361: sending task result for task 0affcac9-a3a5-c734-026a-0000000003b6 11000 1726867157.79456: done sending task result for task 0affcac9-a3a5-c734-026a-0000000003b6 11000 1726867157.79459: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.024426", "end": "2024-09-20 17:19:17.749944", "rc": 0, "start": "2024-09-20 17:19:17.725518" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 11000 1726867157.79558: no more pending results, returning what we have 11000 1726867157.79561: results queue empty 11000 1726867157.79562: checking for any_errors_fatal 11000 1726867157.79567: done checking for any_errors_fatal 11000 1726867157.79568: checking for max_fail_percentage 11000 1726867157.79569: done checking for max_fail_percentage 11000 1726867157.79570: checking to see if all hosts have failed and the running result is not ok 11000 1726867157.79571: done checking to see if all hosts have failed 11000 1726867157.79571: getting the remaining hosts for this loop 11000 1726867157.79573: done getting the remaining hosts for this loop 11000 1726867157.79576: getting the next task for host managed_node1 11000 1726867157.79584: done getting next task for host managed_node1 11000 1726867157.79586: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11000 1726867157.79590: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867157.79595: getting variables 11000 1726867157.79597: in VariableManager get_vars() 11000 1726867157.79633: Calling all_inventory to load vars for managed_node1 11000 1726867157.79636: Calling groups_inventory to load vars for managed_node1 11000 1726867157.79638: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867157.79648: Calling all_plugins_play to load vars for managed_node1 11000 1726867157.79650: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867157.79653: Calling groups_plugins_play to load vars for managed_node1 11000 1726867157.80424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867157.81393: done with get_vars() 11000 1726867157.81407: done getting variables 11000 1726867157.81449: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:19:17 -0400 (0:00:00.364) 0:00:19.457 ****** 11000 1726867157.81470: entering _queue_task() for managed_node1/set_fact 11000 1726867157.81688: worker is 1 (out of 1 available) 11000 1726867157.81701: exiting _queue_task() for managed_node1/set_fact 11000 1726867157.81714: done queuing things up, now waiting for results queue to drain 11000 1726867157.81716: waiting for pending results... 
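The task queued above (get_profile_stat.yml:35) evaluates nm_profile_exists.rc == 0 and, as the result further down shows, sets three flags via set_fact. A minimal sketch of an equivalent task, reconstructed from the conditional and the reported ansible_facts rather than copied from the role source:

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0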
11000 1726867157.81876: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11000 1726867157.81947: in run() - task 0affcac9-a3a5-c734-026a-0000000003b7 11000 1726867157.81957: variable 'ansible_search_path' from source: unknown 11000 1726867157.81960: variable 'ansible_search_path' from source: unknown 11000 1726867157.81994: calling self._execute() 11000 1726867157.82063: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867157.82067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.82082: variable 'omit' from source: magic vars 11000 1726867157.82346: variable 'ansible_distribution_major_version' from source: facts 11000 1726867157.82356: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867157.82451: variable 'nm_profile_exists' from source: set_fact 11000 1726867157.82462: Evaluated conditional (nm_profile_exists.rc == 0): True 11000 1726867157.82467: variable 'omit' from source: magic vars 11000 1726867157.82499: variable 'omit' from source: magic vars 11000 1726867157.82523: variable 'omit' from source: magic vars 11000 1726867157.82553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867157.82580: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867157.82598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867157.82613: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867157.82623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867157.82645: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867157.82648: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867157.82651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.82717: Set connection var ansible_shell_type to sh 11000 1726867157.82725: Set connection var ansible_pipelining to False 11000 1726867157.82734: Set connection var ansible_shell_executable to /bin/sh 11000 1726867157.82737: Set connection var ansible_connection to ssh 11000 1726867157.82739: Set connection var ansible_timeout to 10 11000 1726867157.82744: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867157.82763: variable 'ansible_shell_executable' from source: unknown 11000 1726867157.82766: variable 'ansible_connection' from source: unknown 11000 1726867157.82769: variable 'ansible_module_compression' from source: unknown 11000 1726867157.82771: variable 'ansible_shell_type' from source: unknown 11000 1726867157.82773: variable 'ansible_shell_executable' from source: unknown 11000 1726867157.82776: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867157.82782: variable 'ansible_pipelining' from source: unknown 11000 1726867157.82784: variable 'ansible_timeout' from source: unknown 11000 1726867157.82787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.82890: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867157.82902: variable 'omit' from source: magic vars 11000 1726867157.82907: starting attempt loop 11000 1726867157.82911: running the handler 11000 1726867157.82921: handler run complete 11000 1726867157.82929: attempt loop complete, returning result 11000 1726867157.82932: _execute() done 11000 1726867157.82934: dumping result to json 11000 1726867157.82937: done dumping result, returning 11000 1726867157.82946: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-c734-026a-0000000003b7] 11000 1726867157.82948: sending task result for task 0affcac9-a3a5-c734-026a-0000000003b7 11000 1726867157.83024: done sending task result for task 0affcac9-a3a5-c734-026a-0000000003b7 11000 1726867157.83027: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11000 1726867157.83079: no more pending results, returning what we have 11000 1726867157.83081: results queue empty 11000 1726867157.83082: checking for any_errors_fatal 11000 1726867157.83089: done checking for any_errors_fatal 11000 1726867157.83089: checking for max_fail_percentage 11000 1726867157.83091: done checking for max_fail_percentage 11000 1726867157.83092: checking to see if all hosts have failed and the running result is not ok 11000 1726867157.83092: done checking to see if all hosts have failed 11000 1726867157.83093: getting the remaining hosts for this loop 11000 1726867157.83095: done getting the remaining hosts for this loop 11000 1726867157.83098: getting the next task for host managed_node1 11000 1726867157.83105: done getting next task for host managed_node1 11000 1726867157.83107: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11000 1726867157.83110: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867157.83113: getting variables 11000 1726867157.83115: in VariableManager get_vars() 11000 1726867157.83146: Calling all_inventory to load vars for managed_node1 11000 1726867157.83148: Calling groups_inventory to load vars for managed_node1 11000 1726867157.83150: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867157.83158: Calling all_plugins_play to load vars for managed_node1 11000 1726867157.83160: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867157.83162: Calling groups_plugins_play to load vars for managed_node1 11000 1726867157.83887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867157.84731: done with get_vars() 11000 1726867157.84745: done getting variables 11000 1726867157.84785: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867157.84868: variable 'profile' from source: include params 11000 1726867157.84871: variable 'item' from source: include params 11000 1726867157.84915: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:19:17 -0400 (0:00:00.034) 0:00:19.492 ****** 11000 1726867157.84940: entering _queue_task() for managed_node1/command 11000 1726867157.85140: worker is 1 (out of 1 available) 11000 1726867157.85153: exiting _queue_task() for managed_node1/command 11000 1726867157.85164: done queuing things up, now waiting for results queue to drain 11000 1726867157.85165: waiting for pending results... 
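The next four tasks in get_profile_stat.yml (lines 49, 56, 62 and 69) only apply when an ifcfg file exists for the profile; here the connection is stored as a NetworkManager keyfile (the .nmconnection paths in the nmcli output above), so profile_stat.stat.exists evaluates to false and each of them is skipped in the output that follows. They share a guard of the following shape; only the when: condition is taken from the log, the command body and register name are hypothetical illustrations:

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      command: grep '^# Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # hypothetical command body
      register: ifcfg_ansible_managed                                                         # hypothetical register name
      when: profile_stat.stat.exists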
11000 1726867157.85324: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0 11000 1726867157.85395: in run() - task 0affcac9-a3a5-c734-026a-0000000003b9 11000 1726867157.85406: variable 'ansible_search_path' from source: unknown 11000 1726867157.85410: variable 'ansible_search_path' from source: unknown 11000 1726867157.85434: calling self._execute() 11000 1726867157.85502: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867157.85511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.85517: variable 'omit' from source: magic vars 11000 1726867157.85764: variable 'ansible_distribution_major_version' from source: facts 11000 1726867157.85773: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867157.85856: variable 'profile_stat' from source: set_fact 11000 1726867157.85868: Evaluated conditional (profile_stat.stat.exists): False 11000 1726867157.85871: when evaluation is False, skipping this task 11000 1726867157.85874: _execute() done 11000 1726867157.85876: dumping result to json 11000 1726867157.85881: done dumping result, returning 11000 1726867157.85886: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0 [0affcac9-a3a5-c734-026a-0000000003b9] 11000 1726867157.85893: sending task result for task 0affcac9-a3a5-c734-026a-0000000003b9 11000 1726867157.85969: done sending task result for task 0affcac9-a3a5-c734-026a-0000000003b9 11000 1726867157.85972: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11000 1726867157.86022: no more pending results, returning what we have 11000 1726867157.86024: results queue empty 11000 1726867157.86025: checking for any_errors_fatal 11000 1726867157.86030: done checking for any_errors_fatal 11000 1726867157.86031: checking for max_fail_percentage 11000 1726867157.86033: done checking for max_fail_percentage 11000 1726867157.86034: checking to see if all hosts have failed and the running result is not ok 11000 1726867157.86035: done checking to see if all hosts have failed 11000 1726867157.86035: getting the remaining hosts for this loop 11000 1726867157.86037: done getting the remaining hosts for this loop 11000 1726867157.86040: getting the next task for host managed_node1 11000 1726867157.86046: done getting next task for host managed_node1 11000 1726867157.86047: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11000 1726867157.86051: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867157.86054: getting variables 11000 1726867157.86055: in VariableManager get_vars() 11000 1726867157.86092: Calling all_inventory to load vars for managed_node1 11000 1726867157.86095: Calling groups_inventory to load vars for managed_node1 11000 1726867157.86097: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867157.86105: Calling all_plugins_play to load vars for managed_node1 11000 1726867157.86108: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867157.86110: Calling groups_plugins_play to load vars for managed_node1 11000 1726867157.86936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867157.87773: done with get_vars() 11000 1726867157.87789: done getting variables 11000 1726867157.87829: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867157.87898: variable 'profile' from source: include params 11000 1726867157.87901: variable 'item' from source: include params 11000 1726867157.87941: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:19:17 -0400 (0:00:00.030) 0:00:19.522 ****** 11000 1726867157.87961: entering _queue_task() for managed_node1/set_fact 11000 1726867157.88151: worker is 1 (out of 1 available) 11000 1726867157.88162: exiting _queue_task() for managed_node1/set_fact 11000 1726867157.88173: done queuing things up, now waiting for results queue to drain 11000 1726867157.88174: waiting for pending results... 
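The profile_stat variable driving these skips is not set anywhere in this excerpt; it is presumably registered by a stat task earlier in get_profile_stat.yml that checks for the ifcfg file. A hypothetical sketch for orientation only (the task name and file path are assumptions):

    - name: Get stat of the profile file
      stat:
        path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # assumed location of the legacy ifcfg file
      register: profile_stat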
11000 1726867157.88334: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 11000 1726867157.88413: in run() - task 0affcac9-a3a5-c734-026a-0000000003ba 11000 1726867157.88425: variable 'ansible_search_path' from source: unknown 11000 1726867157.88429: variable 'ansible_search_path' from source: unknown 11000 1726867157.88455: calling self._execute() 11000 1726867157.88526: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867157.88530: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.88539: variable 'omit' from source: magic vars 11000 1726867157.88785: variable 'ansible_distribution_major_version' from source: facts 11000 1726867157.88796: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867157.88875: variable 'profile_stat' from source: set_fact 11000 1726867157.88890: Evaluated conditional (profile_stat.stat.exists): False 11000 1726867157.88893: when evaluation is False, skipping this task 11000 1726867157.88896: _execute() done 11000 1726867157.88899: dumping result to json 11000 1726867157.88902: done dumping result, returning 11000 1726867157.88905: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 [0affcac9-a3a5-c734-026a-0000000003ba] 11000 1726867157.88910: sending task result for task 0affcac9-a3a5-c734-026a-0000000003ba 11000 1726867157.88990: done sending task result for task 0affcac9-a3a5-c734-026a-0000000003ba 11000 1726867157.88993: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11000 1726867157.89032: no more pending results, returning what we have 11000 1726867157.89035: results queue empty 11000 1726867157.89036: checking for any_errors_fatal 11000 1726867157.89040: done checking for any_errors_fatal 11000 1726867157.89041: checking for max_fail_percentage 11000 1726867157.89042: done checking for max_fail_percentage 11000 1726867157.89043: checking to see if all hosts have failed and the running result is not ok 11000 1726867157.89044: done checking to see if all hosts have failed 11000 1726867157.89044: getting the remaining hosts for this loop 11000 1726867157.89046: done getting the remaining hosts for this loop 11000 1726867157.89048: getting the next task for host managed_node1 11000 1726867157.89054: done getting next task for host managed_node1 11000 1726867157.89056: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11000 1726867157.89059: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867157.89063: getting variables 11000 1726867157.89064: in VariableManager get_vars() 11000 1726867157.89099: Calling all_inventory to load vars for managed_node1 11000 1726867157.89101: Calling groups_inventory to load vars for managed_node1 11000 1726867157.89103: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867157.89112: Calling all_plugins_play to load vars for managed_node1 11000 1726867157.89114: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867157.89117: Calling groups_plugins_play to load vars for managed_node1 11000 1726867157.89834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867157.90684: done with get_vars() 11000 1726867157.90700: done getting variables 11000 1726867157.90736: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867157.90808: variable 'profile' from source: include params 11000 1726867157.90811: variable 'item' from source: include params 11000 1726867157.90848: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:19:17 -0400 (0:00:00.029) 0:00:19.551 ****** 11000 1726867157.90867: entering _queue_task() for managed_node1/command 11000 1726867157.91055: worker is 1 (out of 1 available) 11000 1726867157.91068: exiting _queue_task() for managed_node1/command 11000 1726867157.91081: done queuing things up, now waiting for results queue to drain 11000 1726867157.91082: waiting for pending results... 
11000 1726867157.91225: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0 11000 1726867157.91281: in run() - task 0affcac9-a3a5-c734-026a-0000000003bb 11000 1726867157.91293: variable 'ansible_search_path' from source: unknown 11000 1726867157.91297: variable 'ansible_search_path' from source: unknown 11000 1726867157.91324: calling self._execute() 11000 1726867157.91391: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867157.91395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.91403: variable 'omit' from source: magic vars 11000 1726867157.91645: variable 'ansible_distribution_major_version' from source: facts 11000 1726867157.91655: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867157.91732: variable 'profile_stat' from source: set_fact 11000 1726867157.91745: Evaluated conditional (profile_stat.stat.exists): False 11000 1726867157.91749: when evaluation is False, skipping this task 11000 1726867157.91752: _execute() done 11000 1726867157.91755: dumping result to json 11000 1726867157.91757: done dumping result, returning 11000 1726867157.91760: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0 [0affcac9-a3a5-c734-026a-0000000003bb] 11000 1726867157.91767: sending task result for task 0affcac9-a3a5-c734-026a-0000000003bb 11000 1726867157.91841: done sending task result for task 0affcac9-a3a5-c734-026a-0000000003bb 11000 1726867157.91847: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11000 1726867157.91902: no more pending results, returning what we have 11000 1726867157.91904: results queue empty 11000 1726867157.91905: checking for any_errors_fatal 11000 1726867157.91910: done checking for any_errors_fatal 11000 1726867157.91910: checking for max_fail_percentage 11000 1726867157.91912: done checking for max_fail_percentage 11000 1726867157.91913: checking to see if all hosts have failed and the running result is not ok 11000 1726867157.91913: done checking to see if all hosts have failed 11000 1726867157.91914: getting the remaining hosts for this loop 11000 1726867157.91915: done getting the remaining hosts for this loop 11000 1726867157.91918: getting the next task for host managed_node1 11000 1726867157.91922: done getting next task for host managed_node1 11000 1726867157.91924: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11000 1726867157.91928: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867157.91931: getting variables 11000 1726867157.91932: in VariableManager get_vars() 11000 1726867157.91960: Calling all_inventory to load vars for managed_node1 11000 1726867157.91963: Calling groups_inventory to load vars for managed_node1 11000 1726867157.91965: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867157.91973: Calling all_plugins_play to load vars for managed_node1 11000 1726867157.91975: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867157.91986: Calling groups_plugins_play to load vars for managed_node1 11000 1726867157.92796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867157.93626: done with get_vars() 11000 1726867157.93640: done getting variables 11000 1726867157.93676: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867157.93746: variable 'profile' from source: include params 11000 1726867157.93749: variable 'item' from source: include params 11000 1726867157.93785: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:19:17 -0400 (0:00:00.029) 0:00:19.581 ****** 11000 1726867157.93806: entering _queue_task() for managed_node1/set_fact 11000 1726867157.93984: worker is 1 (out of 1 available) 11000 1726867157.93994: exiting _queue_task() for managed_node1/set_fact 11000 1726867157.94005: done queuing things up, now waiting for results queue to drain 11000 1726867157.94006: waiting for pending results... 
11000 1726867157.94170: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0 11000 1726867157.94243: in run() - task 0affcac9-a3a5-c734-026a-0000000003bc 11000 1726867157.94255: variable 'ansible_search_path' from source: unknown 11000 1726867157.94261: variable 'ansible_search_path' from source: unknown 11000 1726867157.94287: calling self._execute() 11000 1726867157.94354: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867157.94359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.94367: variable 'omit' from source: magic vars 11000 1726867157.94620: variable 'ansible_distribution_major_version' from source: facts 11000 1726867157.94629: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867157.94711: variable 'profile_stat' from source: set_fact 11000 1726867157.94723: Evaluated conditional (profile_stat.stat.exists): False 11000 1726867157.94726: when evaluation is False, skipping this task 11000 1726867157.94729: _execute() done 11000 1726867157.94731: dumping result to json 11000 1726867157.94734: done dumping result, returning 11000 1726867157.94739: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0 [0affcac9-a3a5-c734-026a-0000000003bc] 11000 1726867157.94744: sending task result for task 0affcac9-a3a5-c734-026a-0000000003bc 11000 1726867157.94820: done sending task result for task 0affcac9-a3a5-c734-026a-0000000003bc 11000 1726867157.94823: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11000 1726867157.94863: no more pending results, returning what we have 11000 1726867157.94866: results queue empty 11000 1726867157.94867: checking for any_errors_fatal 11000 1726867157.94871: done checking for any_errors_fatal 11000 1726867157.94872: checking for max_fail_percentage 11000 1726867157.94873: done checking for max_fail_percentage 11000 1726867157.94875: checking to see if all hosts have failed and the running result is not ok 11000 1726867157.94875: done checking to see if all hosts have failed 11000 1726867157.94876: getting the remaining hosts for this loop 11000 1726867157.94880: done getting the remaining hosts for this loop 11000 1726867157.94882: getting the next task for host managed_node1 11000 1726867157.94889: done getting next task for host managed_node1 11000 1726867157.94891: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11000 1726867157.94894: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867157.94897: getting variables 11000 1726867157.94899: in VariableManager get_vars() 11000 1726867157.94929: Calling all_inventory to load vars for managed_node1 11000 1726867157.94931: Calling groups_inventory to load vars for managed_node1 11000 1726867157.94933: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867157.94942: Calling all_plugins_play to load vars for managed_node1 11000 1726867157.94944: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867157.94947: Calling groups_plugins_play to load vars for managed_node1 11000 1726867157.95651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867157.96483: done with get_vars() 11000 1726867157.96499: done getting variables 11000 1726867157.96537: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867157.96609: variable 'profile' from source: include params 11000 1726867157.96613: variable 'item' from source: include params 11000 1726867157.96649: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 17:19:17 -0400 (0:00:00.028) 0:00:19.609 ****** 11000 1726867157.96668: entering _queue_task() for managed_node1/assert 11000 1726867157.96846: worker is 1 (out of 1 available) 11000 1726867157.96859: exiting _queue_task() for managed_node1/assert 11000 1726867157.96869: done queuing things up, now waiting for results queue to drain 11000 1726867157.96871: waiting for pending results... 
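The assertion queued above (assert_profile_present.yml:5) checks the flag that the earlier set_fact derived from the nmcli output; as the evaluation below shows, lsr_net_profile_exists is true and the assertion passes. A minimal sketch of an equivalent task:

    - name: Assert that the profile is present - '{{ profile }}'
      assert:
        that:
          - lsr_net_profile_exists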
11000 1726867157.97029: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0' 11000 1726867157.97083: in run() - task 0affcac9-a3a5-c734-026a-000000000261 11000 1726867157.97096: variable 'ansible_search_path' from source: unknown 11000 1726867157.97101: variable 'ansible_search_path' from source: unknown 11000 1726867157.97127: calling self._execute() 11000 1726867157.97194: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867157.97199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.97207: variable 'omit' from source: magic vars 11000 1726867157.97458: variable 'ansible_distribution_major_version' from source: facts 11000 1726867157.97468: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867157.97473: variable 'omit' from source: magic vars 11000 1726867157.97502: variable 'omit' from source: magic vars 11000 1726867157.97569: variable 'profile' from source: include params 11000 1726867157.97572: variable 'item' from source: include params 11000 1726867157.97618: variable 'item' from source: include params 11000 1726867157.97635: variable 'omit' from source: magic vars 11000 1726867157.97666: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867157.97694: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867157.97709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867157.97722: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867157.97731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867157.97760: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867157.97763: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867157.97765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.97829: Set connection var ansible_shell_type to sh 11000 1726867157.97835: Set connection var ansible_pipelining to False 11000 1726867157.97842: Set connection var ansible_shell_executable to /bin/sh 11000 1726867157.97844: Set connection var ansible_connection to ssh 11000 1726867157.97849: Set connection var ansible_timeout to 10 11000 1726867157.97856: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867157.97878: variable 'ansible_shell_executable' from source: unknown 11000 1726867157.97882: variable 'ansible_connection' from source: unknown 11000 1726867157.97884: variable 'ansible_module_compression' from source: unknown 11000 1726867157.97887: variable 'ansible_shell_type' from source: unknown 11000 1726867157.97893: variable 'ansible_shell_executable' from source: unknown 11000 1726867157.97895: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867157.97897: variable 'ansible_pipelining' from source: unknown 11000 1726867157.97900: variable 'ansible_timeout' from source: unknown 11000 1726867157.97902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867157.97999: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867157.98008: variable 'omit' from source: magic vars 11000 1726867157.98012: starting attempt loop 11000 1726867157.98015: running the handler 11000 1726867157.98092: variable 'lsr_net_profile_exists' from source: set_fact 11000 1726867157.98096: Evaluated conditional (lsr_net_profile_exists): True 11000 1726867157.98099: handler run complete 11000 1726867157.98110: attempt loop complete, returning result 11000 1726867157.98114: _execute() done 11000 1726867157.98116: dumping result to json 11000 1726867157.98118: done dumping result, returning 11000 1726867157.98124: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0' [0affcac9-a3a5-c734-026a-000000000261] 11000 1726867157.98129: sending task result for task 0affcac9-a3a5-c734-026a-000000000261 11000 1726867157.98204: done sending task result for task 0affcac9-a3a5-c734-026a-000000000261 11000 1726867157.98207: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11000 1726867157.98249: no more pending results, returning what we have 11000 1726867157.98251: results queue empty 11000 1726867157.98252: checking for any_errors_fatal 11000 1726867157.98257: done checking for any_errors_fatal 11000 1726867157.98257: checking for max_fail_percentage 11000 1726867157.98258: done checking for max_fail_percentage 11000 1726867157.98259: checking to see if all hosts have failed and the running result is not ok 11000 1726867157.98260: done checking to see if all hosts have failed 11000 1726867157.98261: getting the remaining hosts for this loop 11000 1726867157.98262: done getting the remaining hosts for this loop 11000 1726867157.98265: getting the next task for host managed_node1 11000 1726867157.98270: done getting next task for host managed_node1 11000 1726867157.98272: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11000 1726867157.98275: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867157.98280: getting variables 11000 1726867157.98281: in VariableManager get_vars() 11000 1726867157.98314: Calling all_inventory to load vars for managed_node1 11000 1726867157.98317: Calling groups_inventory to load vars for managed_node1 11000 1726867157.98319: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867157.98327: Calling all_plugins_play to load vars for managed_node1 11000 1726867157.98329: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867157.98332: Calling groups_plugins_play to load vars for managed_node1 11000 1726867157.99134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867157.99975: done with get_vars() 11000 1726867157.99992: done getting variables 11000 1726867158.00031: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867158.00109: variable 'profile' from source: include params 11000 1726867158.00112: variable 'item' from source: include params 11000 1726867158.00149: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 17:19:18 -0400 (0:00:00.034) 0:00:19.644 ****** 11000 1726867158.00173: entering _queue_task() for managed_node1/assert 11000 1726867158.00382: worker is 1 (out of 1 available) 11000 1726867158.00397: exiting _queue_task() for managed_node1/assert 11000 1726867158.00409: done queuing things up, now waiting for results queue to drain 11000 1726867158.00410: waiting for pending results... 
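Before every task, the executor re-reads ansible_host and ansible_ssh_extra_args from the host vars of managed_node1 and then pins the connection vars (ssh connection, sh shell, 10 second timeout, ZIP_DEFLATED module compression). A hedged sketch of inventory host vars that would produce these lookups; the address and interpreter are taken from the ssh and AnsiballZ output later in this log, while the ansible_ssh_extra_args value is a placeholder, since the log never prints it:

all:
  hosts:
    managed_node1:
      ansible_host: 10.31.12.57
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no     # placeholder; real value not shown in the log
      ansible_python_interpreter: /usr/bin/python3.12          # matches the interpreter used to run AnsiballZ_stat.py below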
11000 1726867158.00570: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0' 11000 1726867158.00631: in run() - task 0affcac9-a3a5-c734-026a-000000000262 11000 1726867158.00642: variable 'ansible_search_path' from source: unknown 11000 1726867158.00645: variable 'ansible_search_path' from source: unknown 11000 1726867158.00673: calling self._execute() 11000 1726867158.00743: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.00747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.00755: variable 'omit' from source: magic vars 11000 1726867158.01014: variable 'ansible_distribution_major_version' from source: facts 11000 1726867158.01023: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867158.01030: variable 'omit' from source: magic vars 11000 1726867158.01058: variable 'omit' from source: magic vars 11000 1726867158.01129: variable 'profile' from source: include params 11000 1726867158.01133: variable 'item' from source: include params 11000 1726867158.01176: variable 'item' from source: include params 11000 1726867158.01193: variable 'omit' from source: magic vars 11000 1726867158.01226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867158.01253: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867158.01267: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867158.01281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867158.01294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867158.01317: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867158.01320: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.01323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.01387: Set connection var ansible_shell_type to sh 11000 1726867158.01394: Set connection var ansible_pipelining to False 11000 1726867158.01402: Set connection var ansible_shell_executable to /bin/sh 11000 1726867158.01407: Set connection var ansible_connection to ssh 11000 1726867158.01409: Set connection var ansible_timeout to 10 11000 1726867158.01421: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867158.01436: variable 'ansible_shell_executable' from source: unknown 11000 1726867158.01439: variable 'ansible_connection' from source: unknown 11000 1726867158.01441: variable 'ansible_module_compression' from source: unknown 11000 1726867158.01443: variable 'ansible_shell_type' from source: unknown 11000 1726867158.01445: variable 'ansible_shell_executable' from source: unknown 11000 1726867158.01447: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.01452: variable 'ansible_pipelining' from source: unknown 11000 1726867158.01455: variable 'ansible_timeout' from source: unknown 11000 1726867158.01459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.01558: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867158.01567: variable 'omit' from source: magic vars 11000 1726867158.01572: starting attempt loop 11000 1726867158.01574: running the handler 11000 1726867158.01648: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11000 1726867158.01652: Evaluated conditional (lsr_net_profile_ansible_managed): True 11000 1726867158.01658: handler run complete 11000 1726867158.01669: attempt loop complete, returning result 11000 1726867158.01672: _execute() done 11000 1726867158.01674: dumping result to json 11000 1726867158.01678: done dumping result, returning 11000 1726867158.01685: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0' [0affcac9-a3a5-c734-026a-000000000262] 11000 1726867158.01692: sending task result for task 0affcac9-a3a5-c734-026a-000000000262 11000 1726867158.01763: done sending task result for task 0affcac9-a3a5-c734-026a-000000000262 11000 1726867158.01766: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11000 1726867158.01813: no more pending results, returning what we have 11000 1726867158.01815: results queue empty 11000 1726867158.01816: checking for any_errors_fatal 11000 1726867158.01821: done checking for any_errors_fatal 11000 1726867158.01822: checking for max_fail_percentage 11000 1726867158.01823: done checking for max_fail_percentage 11000 1726867158.01824: checking to see if all hosts have failed and the running result is not ok 11000 1726867158.01825: done checking to see if all hosts have failed 11000 1726867158.01826: getting the remaining hosts for this loop 11000 1726867158.01827: done getting the remaining hosts for this loop 11000 1726867158.01830: getting the next task for host managed_node1 11000 1726867158.01835: done getting next task for host managed_node1 11000 1726867158.01837: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11000 1726867158.01840: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867158.01843: getting variables 11000 1726867158.01844: in VariableManager get_vars() 11000 1726867158.01876: Calling all_inventory to load vars for managed_node1 11000 1726867158.01886: Calling groups_inventory to load vars for managed_node1 11000 1726867158.01891: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867158.01900: Calling all_plugins_play to load vars for managed_node1 11000 1726867158.01902: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867158.01905: Calling groups_plugins_play to load vars for managed_node1 11000 1726867158.02632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867158.03560: done with get_vars() 11000 1726867158.03573: done getting variables 11000 1726867158.03612: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867158.03682: variable 'profile' from source: include params 11000 1726867158.03685: variable 'item' from source: include params 11000 1726867158.03723: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 17:19:18 -0400 (0:00:00.035) 0:00:19.680 ****** 11000 1726867158.03747: entering _queue_task() for managed_node1/assert 11000 1726867158.03933: worker is 1 (out of 1 available) 11000 1726867158.03944: exiting _queue_task() for managed_node1/assert 11000 1726867158.03956: done queuing things up, now waiting for results queue to drain 11000 1726867158.03957: waiting for pending results... 
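Each of these assert tasks resolves 'profile' from include params and 'item' from include params, which indicates that assert_profile_present.yml is included once per profile name, with profile mapped from the loop item. A plausible sketch of that outer include, assuming a task name and loop shape that are not shown verbatim in this excerpt:

- name: Assert that the controller and port profiles are present
  include_tasks: tasks/assert_profile_present.yml
  vars:
    profile: "{{ item }}"
  loop:
    - bond0
    - bond0.0      # the next profile this log goes on to stat (ifcfg-bond0.0)
  # any further loop items are not visible in this excerpt

Only the profile/item mapping and the first two profile names are grounded in the log; the task name above is illustrative.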
11000 1726867158.04124: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0 11000 1726867158.04180: in run() - task 0affcac9-a3a5-c734-026a-000000000263 11000 1726867158.04195: variable 'ansible_search_path' from source: unknown 11000 1726867158.04198: variable 'ansible_search_path' from source: unknown 11000 1726867158.04225: calling self._execute() 11000 1726867158.04299: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.04303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.04311: variable 'omit' from source: magic vars 11000 1726867158.04564: variable 'ansible_distribution_major_version' from source: facts 11000 1726867158.04573: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867158.04580: variable 'omit' from source: magic vars 11000 1726867158.04609: variable 'omit' from source: magic vars 11000 1726867158.04676: variable 'profile' from source: include params 11000 1726867158.04681: variable 'item' from source: include params 11000 1726867158.04729: variable 'item' from source: include params 11000 1726867158.04745: variable 'omit' from source: magic vars 11000 1726867158.04774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867158.04804: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867158.04818: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867158.04831: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867158.04848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867158.04869: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867158.04872: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.04874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.04940: Set connection var ansible_shell_type to sh 11000 1726867158.04946: Set connection var ansible_pipelining to False 11000 1726867158.04956: Set connection var ansible_shell_executable to /bin/sh 11000 1726867158.04960: Set connection var ansible_connection to ssh 11000 1726867158.04963: Set connection var ansible_timeout to 10 11000 1726867158.05067: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867158.05070: variable 'ansible_shell_executable' from source: unknown 11000 1726867158.05075: variable 'ansible_connection' from source: unknown 11000 1726867158.05082: variable 'ansible_module_compression' from source: unknown 11000 1726867158.05085: variable 'ansible_shell_type' from source: unknown 11000 1726867158.05087: variable 'ansible_shell_executable' from source: unknown 11000 1726867158.05089: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.05091: variable 'ansible_pipelining' from source: unknown 11000 1726867158.05093: variable 'ansible_timeout' from source: unknown 11000 1726867158.05096: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.05111: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867158.05121: variable 'omit' from source: magic vars 11000 1726867158.05126: starting attempt loop 11000 1726867158.05128: running the handler 11000 1726867158.05205: variable 'lsr_net_profile_fingerprint' from source: set_fact 11000 1726867158.05208: Evaluated conditional (lsr_net_profile_fingerprint): True 11000 1726867158.05214: handler run complete 11000 1726867158.05224: attempt loop complete, returning result 11000 1726867158.05227: _execute() done 11000 1726867158.05229: dumping result to json 11000 1726867158.05233: done dumping result, returning 11000 1726867158.05239: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0 [0affcac9-a3a5-c734-026a-000000000263] 11000 1726867158.05244: sending task result for task 0affcac9-a3a5-c734-026a-000000000263 11000 1726867158.05320: done sending task result for task 0affcac9-a3a5-c734-026a-000000000263 11000 1726867158.05323: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11000 1726867158.05365: no more pending results, returning what we have 11000 1726867158.05368: results queue empty 11000 1726867158.05369: checking for any_errors_fatal 11000 1726867158.05373: done checking for any_errors_fatal 11000 1726867158.05374: checking for max_fail_percentage 11000 1726867158.05375: done checking for max_fail_percentage 11000 1726867158.05376: checking to see if all hosts have failed and the running result is not ok 11000 1726867158.05379: done checking to see if all hosts have failed 11000 1726867158.05380: getting the remaining hosts for this loop 11000 1726867158.05381: done getting the remaining hosts for this loop 11000 1726867158.05384: getting the next task for host managed_node1 11000 1726867158.05391: done getting next task for host managed_node1 11000 1726867158.05394: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11000 1726867158.05396: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867158.05400: getting variables 11000 1726867158.05401: in VariableManager get_vars() 11000 1726867158.05432: Calling all_inventory to load vars for managed_node1 11000 1726867158.05434: Calling groups_inventory to load vars for managed_node1 11000 1726867158.05436: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867158.05444: Calling all_plugins_play to load vars for managed_node1 11000 1726867158.05447: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867158.05449: Calling groups_plugins_play to load vars for managed_node1 11000 1726867158.06164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867158.07006: done with get_vars() 11000 1726867158.07020: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 17:19:18 -0400 (0:00:00.033) 0:00:19.713 ****** 11000 1726867158.07076: entering _queue_task() for managed_node1/include_tasks 11000 1726867158.07256: worker is 1 (out of 1 available) 11000 1726867158.07269: exiting _queue_task() for managed_node1/include_tasks 11000 1726867158.07282: done queuing things up, now waiting for results queue to drain 11000 1726867158.07284: waiting for pending results... 11000 1726867158.07432: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 11000 1726867158.07490: in run() - task 0affcac9-a3a5-c734-026a-000000000267 11000 1726867158.07504: variable 'ansible_search_path' from source: unknown 11000 1726867158.07508: variable 'ansible_search_path' from source: unknown 11000 1726867158.07533: calling self._execute() 11000 1726867158.07600: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.07603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.07612: variable 'omit' from source: magic vars 11000 1726867158.07860: variable 'ansible_distribution_major_version' from source: facts 11000 1726867158.07869: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867158.07875: _execute() done 11000 1726867158.07879: dumping result to json 11000 1726867158.07882: done dumping result, returning 11000 1726867158.07889: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-c734-026a-000000000267] 11000 1726867158.07896: sending task result for task 0affcac9-a3a5-c734-026a-000000000267 11000 1726867158.07974: done sending task result for task 0affcac9-a3a5-c734-026a-000000000267 11000 1726867158.07978: WORKER PROCESS EXITING 11000 1726867158.08004: no more pending results, returning what we have 11000 1726867158.08009: in VariableManager get_vars() 11000 1726867158.08049: Calling all_inventory to load vars for managed_node1 11000 1726867158.08052: Calling groups_inventory to load vars for managed_node1 11000 1726867158.08054: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867158.08063: Calling all_plugins_play to load vars for managed_node1 11000 1726867158.08065: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867158.08068: Calling groups_plugins_play to load vars for managed_node1 11000 1726867158.08895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 11000 1726867158.09725: done with get_vars() 11000 1726867158.09736: variable 'ansible_search_path' from source: unknown 11000 1726867158.09737: variable 'ansible_search_path' from source: unknown 11000 1726867158.09762: we have included files to process 11000 1726867158.09763: generating all_blocks data 11000 1726867158.09764: done generating all_blocks data 11000 1726867158.09767: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11000 1726867158.09767: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11000 1726867158.09769: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11000 1726867158.10354: done processing included file 11000 1726867158.10355: iterating over new_blocks loaded from include file 11000 1726867158.10356: in VariableManager get_vars() 11000 1726867158.10368: done with get_vars() 11000 1726867158.10369: filtering new block on tags 11000 1726867158.10385: done filtering new block on tags 11000 1726867158.10389: in VariableManager get_vars() 11000 1726867158.10400: done with get_vars() 11000 1726867158.10401: filtering new block on tags 11000 1726867158.10416: done filtering new block on tags 11000 1726867158.10417: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 11000 1726867158.10421: extending task lists for all hosts with included blocks 11000 1726867158.10520: done extending task lists 11000 1726867158.10521: done processing included files 11000 1726867158.10522: results queue empty 11000 1726867158.10522: checking for any_errors_fatal 11000 1726867158.10524: done checking for any_errors_fatal 11000 1726867158.10524: checking for max_fail_percentage 11000 1726867158.10525: done checking for max_fail_percentage 11000 1726867158.10526: checking to see if all hosts have failed and the running result is not ok 11000 1726867158.10526: done checking to see if all hosts have failed 11000 1726867158.10527: getting the remaining hosts for this loop 11000 1726867158.10528: done getting the remaining hosts for this loop 11000 1726867158.10529: getting the next task for host managed_node1 11000 1726867158.10532: done getting next task for host managed_node1 11000 1726867158.10533: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11000 1726867158.10535: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867158.10536: getting variables 11000 1726867158.10537: in VariableManager get_vars() 11000 1726867158.10545: Calling all_inventory to load vars for managed_node1 11000 1726867158.10547: Calling groups_inventory to load vars for managed_node1 11000 1726867158.10548: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867158.10551: Calling all_plugins_play to load vars for managed_node1 11000 1726867158.10552: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867158.10554: Calling groups_plugins_play to load vars for managed_node1 11000 1726867158.11181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867158.12011: done with get_vars() 11000 1726867158.12024: done getting variables 11000 1726867158.12053: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:19:18 -0400 (0:00:00.049) 0:00:19.763 ****** 11000 1726867158.12072: entering _queue_task() for managed_node1/set_fact 11000 1726867158.12265: worker is 1 (out of 1 available) 11000 1726867158.12275: exiting _queue_task() for managed_node1/set_fact 11000 1726867158.12292: done queuing things up, now waiting for results queue to drain 11000 1726867158.12293: waiting for pending results... 
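The result printed a few lines below shows exactly which facts the first task of get_profile_stat.yml (line 3) initializes before the profile is inspected. A sketch of that set_fact task, matching the task name and the ansible_facts reported in the log:

- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

Starting from false means a profile that is never found leaves all three flags false, so the asserts in assert_profile_present.yml fail loudly instead of passing by accident.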
11000 1726867158.12449: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 11000 1726867158.12527: in run() - task 0affcac9-a3a5-c734-026a-0000000003fb 11000 1726867158.12538: variable 'ansible_search_path' from source: unknown 11000 1726867158.12541: variable 'ansible_search_path' from source: unknown 11000 1726867158.12568: calling self._execute() 11000 1726867158.12636: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.12642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.12651: variable 'omit' from source: magic vars 11000 1726867158.12907: variable 'ansible_distribution_major_version' from source: facts 11000 1726867158.12917: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867158.12923: variable 'omit' from source: magic vars 11000 1726867158.12953: variable 'omit' from source: magic vars 11000 1726867158.12980: variable 'omit' from source: magic vars 11000 1726867158.13011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867158.13037: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867158.13053: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867158.13072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867158.13075: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867158.13102: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867158.13106: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.13108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.13169: Set connection var ansible_shell_type to sh 11000 1726867158.13176: Set connection var ansible_pipelining to False 11000 1726867158.13192: Set connection var ansible_shell_executable to /bin/sh 11000 1726867158.13196: Set connection var ansible_connection to ssh 11000 1726867158.13198: Set connection var ansible_timeout to 10 11000 1726867158.13202: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867158.13221: variable 'ansible_shell_executable' from source: unknown 11000 1726867158.13224: variable 'ansible_connection' from source: unknown 11000 1726867158.13227: variable 'ansible_module_compression' from source: unknown 11000 1726867158.13230: variable 'ansible_shell_type' from source: unknown 11000 1726867158.13232: variable 'ansible_shell_executable' from source: unknown 11000 1726867158.13235: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.13237: variable 'ansible_pipelining' from source: unknown 11000 1726867158.13239: variable 'ansible_timeout' from source: unknown 11000 1726867158.13242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.13341: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867158.13350: variable 
'omit' from source: magic vars 11000 1726867158.13356: starting attempt loop 11000 1726867158.13358: running the handler 11000 1726867158.13368: handler run complete 11000 1726867158.13376: attempt loop complete, returning result 11000 1726867158.13381: _execute() done 11000 1726867158.13383: dumping result to json 11000 1726867158.13385: done dumping result, returning 11000 1726867158.13397: done running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-c734-026a-0000000003fb] 11000 1726867158.13399: sending task result for task 0affcac9-a3a5-c734-026a-0000000003fb 11000 1726867158.13468: done sending task result for task 0affcac9-a3a5-c734-026a-0000000003fb 11000 1726867158.13471: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11000 1726867158.13556: no more pending results, returning what we have 11000 1726867158.13559: results queue empty 11000 1726867158.13559: checking for any_errors_fatal 11000 1726867158.13561: done checking for any_errors_fatal 11000 1726867158.13561: checking for max_fail_percentage 11000 1726867158.13562: done checking for max_fail_percentage 11000 1726867158.13563: checking to see if all hosts have failed and the running result is not ok 11000 1726867158.13564: done checking to see if all hosts have failed 11000 1726867158.13565: getting the remaining hosts for this loop 11000 1726867158.13566: done getting the remaining hosts for this loop 11000 1726867158.13568: getting the next task for host managed_node1 11000 1726867158.13573: done getting next task for host managed_node1 11000 1726867158.13575: ^ task is: TASK: Stat profile file 11000 1726867158.13581: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867158.13584: getting variables 11000 1726867158.13585: in VariableManager get_vars() 11000 1726867158.13620: Calling all_inventory to load vars for managed_node1 11000 1726867158.13623: Calling groups_inventory to load vars for managed_node1 11000 1726867158.13625: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867158.13633: Calling all_plugins_play to load vars for managed_node1 11000 1726867158.13636: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867158.13638: Calling groups_plugins_play to load vars for managed_node1 11000 1726867158.14382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867158.15386: done with get_vars() 11000 1726867158.15408: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:19:18 -0400 (0:00:00.034) 0:00:19.798 ****** 11000 1726867158.15495: entering _queue_task() for managed_node1/stat 11000 1726867158.15747: worker is 1 (out of 1 available) 11000 1726867158.15760: exiting _queue_task() for managed_node1/stat 11000 1726867158.15772: done queuing things up, now waiting for results queue to drain 11000 1726867158.15774: waiting for pending results... 11000 1726867158.16097: running TaskExecutor() for managed_node1/TASK: Stat profile file 11000 1726867158.16165: in run() - task 0affcac9-a3a5-c734-026a-0000000003fc 11000 1726867158.16193: variable 'ansible_search_path' from source: unknown 11000 1726867158.16203: variable 'ansible_search_path' from source: unknown 11000 1726867158.16284: calling self._execute() 11000 1726867158.16348: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.16362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.16381: variable 'omit' from source: magic vars 11000 1726867158.16760: variable 'ansible_distribution_major_version' from source: facts 11000 1726867158.16781: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867158.16982: variable 'omit' from source: magic vars 11000 1726867158.16985: variable 'omit' from source: magic vars 11000 1726867158.16990: variable 'profile' from source: include params 11000 1726867158.16992: variable 'item' from source: include params 11000 1726867158.17019: variable 'item' from source: include params 11000 1726867158.17045: variable 'omit' from source: magic vars 11000 1726867158.17094: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867158.17139: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867158.17164: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867158.17194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867158.17217: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867158.17254: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867158.17264: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.17274: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.17385: Set connection var ansible_shell_type to sh 11000 1726867158.17403: Set connection var ansible_pipelining to False 11000 1726867158.17419: Set connection var ansible_shell_executable to /bin/sh 11000 1726867158.17432: Set connection var ansible_connection to ssh 11000 1726867158.17446: Set connection var ansible_timeout to 10 11000 1726867158.17459: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867158.17497: variable 'ansible_shell_executable' from source: unknown 11000 1726867158.17508: variable 'ansible_connection' from source: unknown 11000 1726867158.17517: variable 'ansible_module_compression' from source: unknown 11000 1726867158.17526: variable 'ansible_shell_type' from source: unknown 11000 1726867158.17536: variable 'ansible_shell_executable' from source: unknown 11000 1726867158.17548: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.17586: variable 'ansible_pipelining' from source: unknown 11000 1726867158.17593: variable 'ansible_timeout' from source: unknown 11000 1726867158.17596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.17797: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11000 1726867158.17869: variable 'omit' from source: magic vars 11000 1726867158.17872: starting attempt loop 11000 1726867158.17875: running the handler 11000 1726867158.17878: _low_level_execute_command(): starting 11000 1726867158.17881: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867158.18591: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867158.18639: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867158.18656: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867158.18748: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867158.18766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867158.18868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867158.20545: stdout chunk (state=3): >>>/root <<< 11000 1726867158.20641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867158.20663: stderr chunk (state=3): >>><<< 11000 1726867158.20666: stdout chunk 
(state=3): >>><<< 11000 1726867158.20685: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867158.20701: _low_level_execute_command(): starting 11000 1726867158.20706: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867158.206849-11925-226345946031443 `" && echo ansible-tmp-1726867158.206849-11925-226345946031443="` echo /root/.ansible/tmp/ansible-tmp-1726867158.206849-11925-226345946031443 `" ) && sleep 0' 11000 1726867158.21112: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867158.21116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867158.21119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867158.21129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.21173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867158.21181: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867158.21183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867158.21223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867158.23109: stdout chunk (state=3): >>>ansible-tmp-1726867158.206849-11925-226345946031443=/root/.ansible/tmp/ansible-tmp-1726867158.206849-11925-226345946031443 <<< 11000 1726867158.23218: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 
1726867158.23238: stderr chunk (state=3): >>><<< 11000 1726867158.23241: stdout chunk (state=3): >>><<< 11000 1726867158.23253: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867158.206849-11925-226345946031443=/root/.ansible/tmp/ansible-tmp-1726867158.206849-11925-226345946031443 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867158.23288: variable 'ansible_module_compression' from source: unknown 11000 1726867158.23330: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11000 1726867158.23357: variable 'ansible_facts' from source: unknown 11000 1726867158.23422: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867158.206849-11925-226345946031443/AnsiballZ_stat.py 11000 1726867158.23516: Sending initial data 11000 1726867158.23519: Sent initial data (152 bytes) 11000 1726867158.23918: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867158.23922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.23934: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.23983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867158.23999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867158.24044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867158.25943: stderr chunk (state=3): >>>debug2: Remote version: 3 
debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11000 1726867158.25946: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867158.26014: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11000 1726867158.26111: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmp32wb8kig /root/.ansible/tmp/ansible-tmp-1726867158.206849-11925-226345946031443/AnsiballZ_stat.py <<< 11000 1726867158.26114: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867158.206849-11925-226345946031443/AnsiballZ_stat.py" <<< 11000 1726867158.26158: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmp32wb8kig" to remote "/root/.ansible/tmp/ansible-tmp-1726867158.206849-11925-226345946031443/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867158.206849-11925-226345946031443/AnsiballZ_stat.py" <<< 11000 1726867158.26726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867158.26767: stderr chunk (state=3): >>><<< 11000 1726867158.26770: stdout chunk (state=3): >>><<< 11000 1726867158.26813: done transferring module to remote 11000 1726867158.26821: _low_level_execute_command(): starting 11000 1726867158.26825: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867158.206849-11925-226345946031443/ /root/.ansible/tmp/ansible-tmp-1726867158.206849-11925-226345946031443/AnsiballZ_stat.py && sleep 0' 11000 1726867158.27235: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867158.27242: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867158.27264: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.27267: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867158.27269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.27326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867158.27333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867158.27384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867158.29370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867158.29373: stdout chunk (state=3): >>><<< 11000 1726867158.29380: stderr chunk (state=3): >>><<< 11000 1726867158.29395: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867158.29398: _low_level_execute_command(): starting 11000 1726867158.29402: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867158.206849-11925-226345946031443/AnsiballZ_stat.py && sleep 0' 11000 1726867158.29828: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867158.29831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867158.29834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11000 1726867158.29836: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867158.29838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.29893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867158.29898: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867158.29952: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 11000 1726867158.45269: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11000 1726867158.46489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867158.46520: stderr chunk (state=3): >>><<< 11000 1726867158.46524: stdout chunk (state=3): >>><<< 11000 1726867158.46537: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
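Note on the stat step above: the returned JSON shows the stat module was invoked with get_attributes, get_checksum and get_mime disabled against /etc/sysconfig/network-scripts/ifcfg-bond0.0, and the registered result is referenced later in this log as profile_stat. A minimal task sketch consistent with those module arguments (the "{{ profile }}" templating and the exact contents of the task file are assumptions, not taken from this log):

    - name: Stat profile file
      ansible.builtin.stat:
        path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"   # profile is bond0.0 in this run
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat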
11000 1726867158.46561: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867158.206849-11925-226345946031443/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867158.46569: _low_level_execute_command(): starting 11000 1726867158.46574: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867158.206849-11925-226345946031443/ > /dev/null 2>&1 && sleep 0' 11000 1726867158.47030: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867158.47033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867158.47038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.47040: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867158.47042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.47098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867158.47101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867158.47110: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867158.47152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867158.48960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867158.48988: stderr chunk (state=3): >>><<< 11000 1726867158.48993: stdout chunk (state=3): >>><<< 11000 1726867158.49007: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867158.49013: handler run complete 11000 1726867158.49028: attempt loop complete, returning result 11000 1726867158.49031: _execute() done 11000 1726867158.49033: dumping result to json 11000 1726867158.49036: done dumping result, returning 11000 1726867158.49043: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0affcac9-a3a5-c734-026a-0000000003fc] 11000 1726867158.49047: sending task result for task 0affcac9-a3a5-c734-026a-0000000003fc 11000 1726867158.49134: done sending task result for task 0affcac9-a3a5-c734-026a-0000000003fc 11000 1726867158.49137: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 11000 1726867158.49201: no more pending results, returning what we have 11000 1726867158.49204: results queue empty 11000 1726867158.49205: checking for any_errors_fatal 11000 1726867158.49211: done checking for any_errors_fatal 11000 1726867158.49212: checking for max_fail_percentage 11000 1726867158.49214: done checking for max_fail_percentage 11000 1726867158.49215: checking to see if all hosts have failed and the running result is not ok 11000 1726867158.49215: done checking to see if all hosts have failed 11000 1726867158.49216: getting the remaining hosts for this loop 11000 1726867158.49218: done getting the remaining hosts for this loop 11000 1726867158.49221: getting the next task for host managed_node1 11000 1726867158.49228: done getting next task for host managed_node1 11000 1726867158.49230: ^ task is: TASK: Set NM profile exist flag based on the profile files 11000 1726867158.49235: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867158.49239: getting variables 11000 1726867158.49241: in VariableManager get_vars() 11000 1726867158.49281: Calling all_inventory to load vars for managed_node1 11000 1726867158.49284: Calling groups_inventory to load vars for managed_node1 11000 1726867158.49287: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867158.49300: Calling all_plugins_play to load vars for managed_node1 11000 1726867158.49302: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867158.49305: Calling groups_plugins_play to load vars for managed_node1 11000 1726867158.50095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867158.54129: done with get_vars() 11000 1726867158.54144: done getting variables 11000 1726867158.54183: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:19:18 -0400 (0:00:00.387) 0:00:20.185 ****** 11000 1726867158.54204: entering _queue_task() for managed_node1/set_fact 11000 1726867158.54457: worker is 1 (out of 1 available) 11000 1726867158.54470: exiting _queue_task() for managed_node1/set_fact 11000 1726867158.54485: done queuing things up, now waiting for results queue to drain 11000 1726867158.54486: waiting for pending results... 
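The set_fact task queued above (get_profile_stat.yml:17) only applies when an initscripts-style ifcfg file was found; in the next record the conditional profile_stat.stat.exists evaluates to False and the task is skipped. A hedged sketch of such a guarded set_fact (the fact name is an assumption, since the task never runs here):

    - name: Set NM profile exist flag based on the profile files
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true   # assumed fact name; not observable in this skipped run
      when: profile_stat.stat.exists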
11000 1726867158.54652: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 11000 1726867158.54721: in run() - task 0affcac9-a3a5-c734-026a-0000000003fd 11000 1726867158.54731: variable 'ansible_search_path' from source: unknown 11000 1726867158.54735: variable 'ansible_search_path' from source: unknown 11000 1726867158.54762: calling self._execute() 11000 1726867158.54839: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.54845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.54854: variable 'omit' from source: magic vars 11000 1726867158.55129: variable 'ansible_distribution_major_version' from source: facts 11000 1726867158.55139: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867158.55223: variable 'profile_stat' from source: set_fact 11000 1726867158.55235: Evaluated conditional (profile_stat.stat.exists): False 11000 1726867158.55238: when evaluation is False, skipping this task 11000 1726867158.55241: _execute() done 11000 1726867158.55243: dumping result to json 11000 1726867158.55246: done dumping result, returning 11000 1726867158.55254: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-c734-026a-0000000003fd] 11000 1726867158.55257: sending task result for task 0affcac9-a3a5-c734-026a-0000000003fd 11000 1726867158.55336: done sending task result for task 0affcac9-a3a5-c734-026a-0000000003fd 11000 1726867158.55339: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11000 1726867158.55412: no more pending results, returning what we have 11000 1726867158.55415: results queue empty 11000 1726867158.55415: checking for any_errors_fatal 11000 1726867158.55422: done checking for any_errors_fatal 11000 1726867158.55423: checking for max_fail_percentage 11000 1726867158.55424: done checking for max_fail_percentage 11000 1726867158.55425: checking to see if all hosts have failed and the running result is not ok 11000 1726867158.55426: done checking to see if all hosts have failed 11000 1726867158.55427: getting the remaining hosts for this loop 11000 1726867158.55428: done getting the remaining hosts for this loop 11000 1726867158.55431: getting the next task for host managed_node1 11000 1726867158.55437: done getting next task for host managed_node1 11000 1726867158.55439: ^ task is: TASK: Get NM profile info 11000 1726867158.55443: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867158.55447: getting variables 11000 1726867158.55448: in VariableManager get_vars() 11000 1726867158.55481: Calling all_inventory to load vars for managed_node1 11000 1726867158.55484: Calling groups_inventory to load vars for managed_node1 11000 1726867158.55486: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867158.55496: Calling all_plugins_play to load vars for managed_node1 11000 1726867158.55499: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867158.55501: Calling groups_plugins_play to load vars for managed_node1 11000 1726867158.56222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867158.57078: done with get_vars() 11000 1726867158.57094: done getting variables 11000 1726867158.57132: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:19:18 -0400 (0:00:00.029) 0:00:20.214 ****** 11000 1726867158.57153: entering _queue_task() for managed_node1/shell 11000 1726867158.57371: worker is 1 (out of 1 available) 11000 1726867158.57384: exiting _queue_task() for managed_node1/shell 11000 1726867158.57399: done queuing things up, now waiting for results queue to drain 11000 1726867158.57400: waiting for pending results... 11000 1726867158.57551: running TaskExecutor() for managed_node1/TASK: Get NM profile info 11000 1726867158.57622: in run() - task 0affcac9-a3a5-c734-026a-0000000003fe 11000 1726867158.57635: variable 'ansible_search_path' from source: unknown 11000 1726867158.57638: variable 'ansible_search_path' from source: unknown 11000 1726867158.57662: calling self._execute() 11000 1726867158.57729: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.57735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.57745: variable 'omit' from source: magic vars 11000 1726867158.58003: variable 'ansible_distribution_major_version' from source: facts 11000 1726867158.58013: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867158.58019: variable 'omit' from source: magic vars 11000 1726867158.58053: variable 'omit' from source: magic vars 11000 1726867158.58123: variable 'profile' from source: include params 11000 1726867158.58127: variable 'item' from source: include params 11000 1726867158.58170: variable 'item' from source: include params 11000 1726867158.58191: variable 'omit' from source: magic vars 11000 1726867158.58220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867158.58244: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867158.58260: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867158.58273: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867158.58290: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867158.58310: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867158.58314: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.58316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.58376: Set connection var ansible_shell_type to sh 11000 1726867158.58385: Set connection var ansible_pipelining to False 11000 1726867158.58394: Set connection var ansible_shell_executable to /bin/sh 11000 1726867158.58397: Set connection var ansible_connection to ssh 11000 1726867158.58407: Set connection var ansible_timeout to 10 11000 1726867158.58409: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867158.58427: variable 'ansible_shell_executable' from source: unknown 11000 1726867158.58430: variable 'ansible_connection' from source: unknown 11000 1726867158.58433: variable 'ansible_module_compression' from source: unknown 11000 1726867158.58436: variable 'ansible_shell_type' from source: unknown 11000 1726867158.58438: variable 'ansible_shell_executable' from source: unknown 11000 1726867158.58440: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.58444: variable 'ansible_pipelining' from source: unknown 11000 1726867158.58447: variable 'ansible_timeout' from source: unknown 11000 1726867158.58450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.58550: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867158.58559: variable 'omit' from source: magic vars 11000 1726867158.58563: starting attempt loop 11000 1726867158.58566: running the handler 11000 1726867158.58575: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867158.58593: _low_level_execute_command(): starting 11000 1726867158.58596: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867158.59114: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867158.59119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867158.59122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867158.59124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.59170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867158.59173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867158.59175: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867158.59235: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867158.60905: stdout chunk (state=3): >>>/root <<< 11000 1726867158.61005: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867158.61029: stderr chunk (state=3): >>><<< 11000 1726867158.61033: stdout chunk (state=3): >>><<< 11000 1726867158.61055: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867158.61065: _low_level_execute_command(): starting 11000 1726867158.61070: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867158.6105413-11939-178581863830882 `" && echo ansible-tmp-1726867158.6105413-11939-178581863830882="` echo /root/.ansible/tmp/ansible-tmp-1726867158.6105413-11939-178581863830882 `" ) && sleep 0' 11000 1726867158.61468: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867158.61482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867158.61507: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.61510: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867158.61512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.61564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867158.61570: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867158.61572: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867158.61617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867158.63514: stdout chunk (state=3): >>>ansible-tmp-1726867158.6105413-11939-178581863830882=/root/.ansible/tmp/ansible-tmp-1726867158.6105413-11939-178581863830882 <<< 11000 1726867158.63616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867158.63642: stderr chunk (state=3): >>><<< 11000 1726867158.63645: stdout chunk (state=3): >>><<< 11000 1726867158.63658: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867158.6105413-11939-178581863830882=/root/.ansible/tmp/ansible-tmp-1726867158.6105413-11939-178581863830882 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867158.63683: variable 'ansible_module_compression' from source: unknown 11000 1726867158.63719: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11000 1726867158.63750: variable 'ansible_facts' from source: unknown 11000 1726867158.63806: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867158.6105413-11939-178581863830882/AnsiballZ_command.py 11000 1726867158.63896: Sending initial data 11000 1726867158.63899: Sent initial data (156 bytes) 11000 1726867158.64319: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867158.64323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.64325: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867158.64327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.64370: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867158.64373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867158.64426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867158.65969: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11000 1726867158.65972: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867158.66012: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867158.66058: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpscrslo74 /root/.ansible/tmp/ansible-tmp-1726867158.6105413-11939-178581863830882/AnsiballZ_command.py <<< 11000 1726867158.66064: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867158.6105413-11939-178581863830882/AnsiballZ_command.py" <<< 11000 1726867158.66105: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 11000 1726867158.66108: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpscrslo74" to remote "/root/.ansible/tmp/ansible-tmp-1726867158.6105413-11939-178581863830882/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867158.6105413-11939-178581863830882/AnsiballZ_command.py" <<< 11000 1726867158.66636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867158.66669: stderr chunk (state=3): >>><<< 11000 1726867158.66673: stdout chunk (state=3): >>><<< 11000 1726867158.66720: done transferring module to remote 11000 1726867158.66727: _low_level_execute_command(): starting 11000 1726867158.66732: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867158.6105413-11939-178581863830882/ /root/.ansible/tmp/ansible-tmp-1726867158.6105413-11939-178581863830882/AnsiballZ_command.py && sleep 0' 11000 1726867158.67132: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867158.67138: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.67140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867158.67142: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.67195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867158.67201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867158.67248: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867158.69000: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867158.69022: stderr chunk (state=3): >>><<< 11000 1726867158.69025: stdout chunk (state=3): >>><<< 11000 1726867158.69037: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 
originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867158.69040: _low_level_execute_command(): starting 11000 1726867158.69044: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867158.6105413-11939-178581863830882/AnsiballZ_command.py && sleep 0' 11000 1726867158.69451: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867158.69454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.69456: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867158.69458: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867158.69461: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.69510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867158.69518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867158.69521: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867158.69565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867158.86833: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 17:19:18.845601", "end": "2024-09-20 17:19:18.866494", "delta": "0:00:00.020893", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11000 1726867158.88437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
Shared connection to 10.31.12.57 closed. <<< 11000 1726867158.88468: stderr chunk (state=3): >>><<< 11000 1726867158.88471: stdout chunk (state=3): >>><<< 11000 1726867158.88489: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 17:19:18.845601", "end": "2024-09-20 17:19:18.866494", "delta": "0:00:00.020893", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
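The record above completes the "Get NM profile info" shell task: the remote command nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc returned rc=0 and printed the keyfile path /etc/NetworkManager/system-connections/bond0.0.nmconnection, and the registered result is referenced later as nm_profile_exists. A sketch of an equivalent task, assuming the profile name is parameterized as "{{ profile }}" (the error-handling directive is an assumption; the log only shows a later "Evaluated conditional (False)" that suggests one):

    - name: Get NM profile info
      ansible.builtin.shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
      register: nm_profile_exists
      failed_when: false   # assumption: keep the play going when no NM profile matches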
11000 1726867158.88519: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867158.6105413-11939-178581863830882/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867158.88527: _low_level_execute_command(): starting 11000 1726867158.88531: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867158.6105413-11939-178581863830882/ > /dev/null 2>&1 && sleep 0' 11000 1726867158.88998: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867158.89001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867158.89004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11000 1726867158.89006: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867158.89008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867158.89067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867158.89070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867158.89072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867158.89114: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867158.90910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867158.90935: stderr chunk (state=3): >>><<< 11000 1726867158.90938: stdout chunk (state=3): >>><<< 11000 1726867158.90954: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867158.90960: handler run complete 11000 1726867158.90976: Evaluated conditional (False): False 11000 1726867158.90988: attempt loop complete, returning result 11000 1726867158.90993: _execute() done 11000 1726867158.90995: dumping result to json 11000 1726867158.91001: done dumping result, returning 11000 1726867158.91008: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0affcac9-a3a5-c734-026a-0000000003fe] 11000 1726867158.91012: sending task result for task 0affcac9-a3a5-c734-026a-0000000003fe 11000 1726867158.91110: done sending task result for task 0affcac9-a3a5-c734-026a-0000000003fe 11000 1726867158.91114: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.020893", "end": "2024-09-20 17:19:18.866494", "rc": 0, "start": "2024-09-20 17:19:18.845601" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 11000 1726867158.91184: no more pending results, returning what we have 11000 1726867158.91187: results queue empty 11000 1726867158.91188: checking for any_errors_fatal 11000 1726867158.91193: done checking for any_errors_fatal 11000 1726867158.91193: checking for max_fail_percentage 11000 1726867158.91195: done checking for max_fail_percentage 11000 1726867158.91196: checking to see if all hosts have failed and the running result is not ok 11000 1726867158.91197: done checking to see if all hosts have failed 11000 1726867158.91197: getting the remaining hosts for this loop 11000 1726867158.91199: done getting the remaining hosts for this loop 11000 1726867158.91202: getting the next task for host managed_node1 11000 1726867158.91209: done getting next task for host managed_node1 11000 1726867158.91211: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11000 1726867158.91216: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867158.91219: getting variables 11000 1726867158.91221: in VariableManager get_vars() 11000 1726867158.91267: Calling all_inventory to load vars for managed_node1 11000 1726867158.91270: Calling groups_inventory to load vars for managed_node1 11000 1726867158.91272: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867158.91284: Calling all_plugins_play to load vars for managed_node1 11000 1726867158.91286: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867158.91289: Calling groups_plugins_play to load vars for managed_node1 11000 1726867158.92196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867158.93049: done with get_vars() 11000 1726867158.93063: done getting variables 11000 1726867158.93108: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:19:18 -0400 (0:00:00.359) 0:00:20.574 ****** 11000 1726867158.93130: entering _queue_task() for managed_node1/set_fact 11000 1726867158.93345: worker is 1 (out of 1 available) 11000 1726867158.93356: exiting _queue_task() for managed_node1/set_fact 11000 1726867158.93369: done queuing things up, now waiting for results queue to drain 11000 1726867158.93370: waiting for pending results... 
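The task queued above (get_profile_stat.yml:35) maps the nmcli result onto the lsr_net_profile_* flags. Its inputs and outputs are both visible in the surrounding records: the conditional nm_profile_exists.rc == 0 evaluates True, and the returned ansible_facts set lsr_net_profile_exists, lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint to true. A minimal equivalent task (the fact names and the condition are taken from this log; the task-file layout is an assumption):

    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when: nm_profile_exists.rc == 0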
11000 1726867158.93539: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11000 1726867158.93612: in run() - task 0affcac9-a3a5-c734-026a-0000000003ff 11000 1726867158.93626: variable 'ansible_search_path' from source: unknown 11000 1726867158.93629: variable 'ansible_search_path' from source: unknown 11000 1726867158.93656: calling self._execute() 11000 1726867158.93736: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.93739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.93748: variable 'omit' from source: magic vars 11000 1726867158.94025: variable 'ansible_distribution_major_version' from source: facts 11000 1726867158.94035: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867158.94285: variable 'nm_profile_exists' from source: set_fact 11000 1726867158.94288: Evaluated conditional (nm_profile_exists.rc == 0): True 11000 1726867158.94291: variable 'omit' from source: magic vars 11000 1726867158.94293: variable 'omit' from source: magic vars 11000 1726867158.94295: variable 'omit' from source: magic vars 11000 1726867158.94297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867158.94322: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867158.94346: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867158.94369: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867158.94391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867158.94426: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867158.94437: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.94446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.94548: Set connection var ansible_shell_type to sh 11000 1726867158.94562: Set connection var ansible_pipelining to False 11000 1726867158.94574: Set connection var ansible_shell_executable to /bin/sh 11000 1726867158.94583: Set connection var ansible_connection to ssh 11000 1726867158.94596: Set connection var ansible_timeout to 10 11000 1726867158.94605: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867158.94634: variable 'ansible_shell_executable' from source: unknown 11000 1726867158.94641: variable 'ansible_connection' from source: unknown 11000 1726867158.94647: variable 'ansible_module_compression' from source: unknown 11000 1726867158.94653: variable 'ansible_shell_type' from source: unknown 11000 1726867158.94659: variable 'ansible_shell_executable' from source: unknown 11000 1726867158.94664: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.94671: variable 'ansible_pipelining' from source: unknown 11000 1726867158.94680: variable 'ansible_timeout' from source: unknown 11000 1726867158.94692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.94829: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867158.94848: variable 'omit' from source: magic vars 11000 1726867158.94881: starting attempt loop 11000 1726867158.94885: running the handler 11000 1726867158.94889: handler run complete 11000 1726867158.94895: attempt loop complete, returning result 11000 1726867158.94901: _execute() done 11000 1726867158.94908: dumping result to json 11000 1726867158.95083: done dumping result, returning 11000 1726867158.95086: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-c734-026a-0000000003ff] 11000 1726867158.95092: sending task result for task 0affcac9-a3a5-c734-026a-0000000003ff 11000 1726867158.95151: done sending task result for task 0affcac9-a3a5-c734-026a-0000000003ff 11000 1726867158.95155: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11000 1726867158.95214: no more pending results, returning what we have 11000 1726867158.95217: results queue empty 11000 1726867158.95218: checking for any_errors_fatal 11000 1726867158.95222: done checking for any_errors_fatal 11000 1726867158.95223: checking for max_fail_percentage 11000 1726867158.95224: done checking for max_fail_percentage 11000 1726867158.95224: checking to see if all hosts have failed and the running result is not ok 11000 1726867158.95225: done checking to see if all hosts have failed 11000 1726867158.95226: getting the remaining hosts for this loop 11000 1726867158.95227: done getting the remaining hosts for this loop 11000 1726867158.95230: getting the next task for host managed_node1 11000 1726867158.95238: done getting next task for host managed_node1 11000 1726867158.95240: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11000 1726867158.95243: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867158.95246: getting variables 11000 1726867158.95247: in VariableManager get_vars() 11000 1726867158.95278: Calling all_inventory to load vars for managed_node1 11000 1726867158.95281: Calling groups_inventory to load vars for managed_node1 11000 1726867158.95283: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867158.95291: Calling all_plugins_play to load vars for managed_node1 11000 1726867158.95293: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867158.95296: Calling groups_plugins_play to load vars for managed_node1 11000 1726867158.96026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867158.96879: done with get_vars() 11000 1726867158.96893: done getting variables 11000 1726867158.96931: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867158.97012: variable 'profile' from source: include params 11000 1726867158.97015: variable 'item' from source: include params 11000 1726867158.97055: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:19:18 -0400 (0:00:00.039) 0:00:20.613 ****** 11000 1726867158.97083: entering _queue_task() for managed_node1/command 11000 1726867158.97297: worker is 1 (out of 1 available) 11000 1726867158.97308: exiting _queue_task() for managed_node1/command 11000 1726867158.97318: done queuing things up, now waiting for results queue to drain 11000 1726867158.97320: waiting for pending results... 
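The command task queued above (get_profile_stat.yml:49) checks for the ansible_managed header inside the ifcfg file, but in this run it is skipped because profile_stat.stat.exists is False, so the actual command never appears in the log. A purely illustrative sketch of how such a check could be written (the command text and register name are hypothetical):

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      ansible.builtin.command: grep 'Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
      register: ifcfg_ansible_managed   # hypothetical register name
      when: profile_stat.stat.exists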
11000 1726867158.97694: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 11000 1726867158.97698: in run() - task 0affcac9-a3a5-c734-026a-000000000401 11000 1726867158.97702: variable 'ansible_search_path' from source: unknown 11000 1726867158.97705: variable 'ansible_search_path' from source: unknown 11000 1726867158.97742: calling self._execute() 11000 1726867158.97840: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867158.97852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867158.97866: variable 'omit' from source: magic vars 11000 1726867158.98230: variable 'ansible_distribution_major_version' from source: facts 11000 1726867158.98247: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867158.98371: variable 'profile_stat' from source: set_fact 11000 1726867158.98391: Evaluated conditional (profile_stat.stat.exists): False 11000 1726867158.98397: when evaluation is False, skipping this task 11000 1726867158.98402: _execute() done 11000 1726867158.98407: dumping result to json 11000 1726867158.98412: done dumping result, returning 11000 1726867158.98420: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [0affcac9-a3a5-c734-026a-000000000401] 11000 1726867158.98427: sending task result for task 0affcac9-a3a5-c734-026a-000000000401 11000 1726867158.98517: done sending task result for task 0affcac9-a3a5-c734-026a-000000000401 11000 1726867158.98524: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11000 1726867158.98592: no more pending results, returning what we have 11000 1726867158.98595: results queue empty 11000 1726867158.98596: checking for any_errors_fatal 11000 1726867158.98601: done checking for any_errors_fatal 11000 1726867158.98602: checking for max_fail_percentage 11000 1726867158.98603: done checking for max_fail_percentage 11000 1726867158.98604: checking to see if all hosts have failed and the running result is not ok 11000 1726867158.98605: done checking to see if all hosts have failed 11000 1726867158.98606: getting the remaining hosts for this loop 11000 1726867158.98607: done getting the remaining hosts for this loop 11000 1726867158.98610: getting the next task for host managed_node1 11000 1726867158.98618: done getting next task for host managed_node1 11000 1726867158.98620: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11000 1726867158.98624: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867158.98628: getting variables 11000 1726867158.98630: in VariableManager get_vars() 11000 1726867158.98669: Calling all_inventory to load vars for managed_node1 11000 1726867158.98672: Calling groups_inventory to load vars for managed_node1 11000 1726867158.98675: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867158.98689: Calling all_plugins_play to load vars for managed_node1 11000 1726867158.98693: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867158.98696: Calling groups_plugins_play to load vars for managed_node1 11000 1726867159.00205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867159.01665: done with get_vars() 11000 1726867159.01686: done getting variables 11000 1726867159.01734: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867159.01835: variable 'profile' from source: include params 11000 1726867159.01839: variable 'item' from source: include params 11000 1726867159.01898: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:19:19 -0400 (0:00:00.048) 0:00:20.662 ****** 11000 1726867159.01930: entering _queue_task() for managed_node1/set_fact 11000 1726867159.02178: worker is 1 (out of 1 available) 11000 1726867159.02189: exiting _queue_task() for managed_node1/set_fact 11000 1726867159.02200: done queuing things up, now waiting for results queue to drain 11000 1726867159.02201: waiting for pending results... 
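The skipping result above is the normal outcome of a when: guard on a registered stat result: profile_stat.stat.exists evaluates to False, so the command never reaches the remote host and the callback prints false_condition and skip_reason instead of command output. A hedged sketch of the shape of such a task; the grep pattern, the ifcfg path and the register name are assumptions, only the task name and the condition come from this log:

  - name: Get the ansible_managed comment in ifcfg-{{ profile }}
    command: grep 'ansible_managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
    register: ansible_managed_comment   # hypothetical register name
    when: profile_stat.stat.exists      # False in this run, hence the skip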
11000 1726867159.02595: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 11000 1726867159.02599: in run() - task 0affcac9-a3a5-c734-026a-000000000402 11000 1726867159.02601: variable 'ansible_search_path' from source: unknown 11000 1726867159.02603: variable 'ansible_search_path' from source: unknown 11000 1726867159.02629: calling self._execute() 11000 1726867159.02728: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.02743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.02759: variable 'omit' from source: magic vars 11000 1726867159.03136: variable 'ansible_distribution_major_version' from source: facts 11000 1726867159.03154: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867159.03282: variable 'profile_stat' from source: set_fact 11000 1726867159.03302: Evaluated conditional (profile_stat.stat.exists): False 11000 1726867159.03310: when evaluation is False, skipping this task 11000 1726867159.03317: _execute() done 11000 1726867159.03326: dumping result to json 11000 1726867159.03333: done dumping result, returning 11000 1726867159.03349: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [0affcac9-a3a5-c734-026a-000000000402] 11000 1726867159.03359: sending task result for task 0affcac9-a3a5-c734-026a-000000000402 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11000 1726867159.03503: no more pending results, returning what we have 11000 1726867159.03506: results queue empty 11000 1726867159.03508: checking for any_errors_fatal 11000 1726867159.03514: done checking for any_errors_fatal 11000 1726867159.03515: checking for max_fail_percentage 11000 1726867159.03517: done checking for max_fail_percentage 11000 1726867159.03518: checking to see if all hosts have failed and the running result is not ok 11000 1726867159.03519: done checking to see if all hosts have failed 11000 1726867159.03519: getting the remaining hosts for this loop 11000 1726867159.03521: done getting the remaining hosts for this loop 11000 1726867159.03524: getting the next task for host managed_node1 11000 1726867159.03532: done getting next task for host managed_node1 11000 1726867159.03534: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11000 1726867159.03538: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867159.03544: getting variables 11000 1726867159.03545: in VariableManager get_vars() 11000 1726867159.03589: Calling all_inventory to load vars for managed_node1 11000 1726867159.03592: Calling groups_inventory to load vars for managed_node1 11000 1726867159.03595: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867159.03608: Calling all_plugins_play to load vars for managed_node1 11000 1726867159.03612: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867159.03615: Calling groups_plugins_play to load vars for managed_node1 11000 1726867159.04390: done sending task result for task 0affcac9-a3a5-c734-026a-000000000402 11000 1726867159.04393: WORKER PROCESS EXITING 11000 1726867159.05075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867159.06560: done with get_vars() 11000 1726867159.06581: done getting variables 11000 1726867159.06636: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867159.06738: variable 'profile' from source: include params 11000 1726867159.06742: variable 'item' from source: include params 11000 1726867159.06799: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:19:19 -0400 (0:00:00.048) 0:00:20.711 ****** 11000 1726867159.06831: entering _queue_task() for managed_node1/command 11000 1726867159.07092: worker is 1 (out of 1 available) 11000 1726867159.07103: exiting _queue_task() for managed_node1/command 11000 1726867159.07115: done queuing things up, now waiting for results queue to drain 11000 1726867159.07116: waiting for pending results... 
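The 'Verify the ansible_managed comment' task that was just skipped is a set_fact action (see the action plugin load above), gated by the same profile_stat.stat.exists condition. A plausible shape, assuming the verification simply flips a flag that a later assert checks; the fact name comes from results further down in this log, the rest is an assumption:

  - name: Verify the ansible_managed comment in ifcfg-{{ profile }}
    set_fact:
      lsr_net_profile_ansible_managed: true   # checked later by an assert task
    when: profile_stat.stat.exists            # False in this run, so the flag keeps its initialized value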
11000 1726867159.07376: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 11000 1726867159.07505: in run() - task 0affcac9-a3a5-c734-026a-000000000403 11000 1726867159.07523: variable 'ansible_search_path' from source: unknown 11000 1726867159.07530: variable 'ansible_search_path' from source: unknown 11000 1726867159.07569: calling self._execute() 11000 1726867159.07669: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.07684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.07699: variable 'omit' from source: magic vars 11000 1726867159.08066: variable 'ansible_distribution_major_version' from source: facts 11000 1726867159.08085: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867159.08208: variable 'profile_stat' from source: set_fact 11000 1726867159.08227: Evaluated conditional (profile_stat.stat.exists): False 11000 1726867159.08235: when evaluation is False, skipping this task 11000 1726867159.08242: _execute() done 11000 1726867159.08249: dumping result to json 11000 1726867159.08262: done dumping result, returning 11000 1726867159.08273: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 [0affcac9-a3a5-c734-026a-000000000403] 11000 1726867159.08285: sending task result for task 0affcac9-a3a5-c734-026a-000000000403 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11000 1726867159.08423: no more pending results, returning what we have 11000 1726867159.08427: results queue empty 11000 1726867159.08429: checking for any_errors_fatal 11000 1726867159.08437: done checking for any_errors_fatal 11000 1726867159.08438: checking for max_fail_percentage 11000 1726867159.08440: done checking for max_fail_percentage 11000 1726867159.08441: checking to see if all hosts have failed and the running result is not ok 11000 1726867159.08441: done checking to see if all hosts have failed 11000 1726867159.08442: getting the remaining hosts for this loop 11000 1726867159.08443: done getting the remaining hosts for this loop 11000 1726867159.08447: getting the next task for host managed_node1 11000 1726867159.08454: done getting next task for host managed_node1 11000 1726867159.08456: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11000 1726867159.08461: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867159.08466: getting variables 11000 1726867159.08468: in VariableManager get_vars() 11000 1726867159.08510: Calling all_inventory to load vars for managed_node1 11000 1726867159.08513: Calling groups_inventory to load vars for managed_node1 11000 1726867159.08516: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867159.08529: Calling all_plugins_play to load vars for managed_node1 11000 1726867159.08533: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867159.08536: Calling groups_plugins_play to load vars for managed_node1 11000 1726867159.09290: done sending task result for task 0affcac9-a3a5-c734-026a-000000000403 11000 1726867159.09293: WORKER PROCESS EXITING 11000 1726867159.10069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867159.11112: done with get_vars() 11000 1726867159.11131: done getting variables 11000 1726867159.11191: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867159.11294: variable 'profile' from source: include params 11000 1726867159.11298: variable 'item' from source: include params 11000 1726867159.11355: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:19:19 -0400 (0:00:00.045) 0:00:20.756 ****** 11000 1726867159.11389: entering _queue_task() for managed_node1/set_fact 11000 1726867159.11653: worker is 1 (out of 1 available) 11000 1726867159.11663: exiting _queue_task() for managed_node1/set_fact 11000 1726867159.11674: done queuing things up, now waiting for results queue to drain 11000 1726867159.11676: waiting for pending results... 
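Every task in this stretch evaluates two conditionals in the same order: first the distribution gate (ansible_distribution_major_version != '6'), then the profile_stat.stat.exists guard. A hedged sketch of how that pair could look on the fingerprint verification task just queued; whether the distribution gate is written per task or inherited from an enclosing block is not visible in this log:

  - name: Verify the fingerprint comment in ifcfg-{{ profile }}
    set_fact:
      lsr_net_profile_fingerprint: true            # fact name taken from the results later in this log
    when:
      - ansible_distribution_major_version != '6'  # evaluated first, True on this host
      - profile_stat.stat.exists                   # evaluated second, False, so the task skips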
11000 1726867159.11936: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 11000 1726867159.12065: in run() - task 0affcac9-a3a5-c734-026a-000000000404 11000 1726867159.12102: variable 'ansible_search_path' from source: unknown 11000 1726867159.12105: variable 'ansible_search_path' from source: unknown 11000 1726867159.12127: calling self._execute() 11000 1726867159.12238: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.12242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.12245: variable 'omit' from source: magic vars 11000 1726867159.12905: variable 'ansible_distribution_major_version' from source: facts 11000 1726867159.12982: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867159.13031: variable 'profile_stat' from source: set_fact 11000 1726867159.13051: Evaluated conditional (profile_stat.stat.exists): False 11000 1726867159.13063: when evaluation is False, skipping this task 11000 1726867159.13069: _execute() done 11000 1726867159.13079: dumping result to json 11000 1726867159.13090: done dumping result, returning 11000 1726867159.13100: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [0affcac9-a3a5-c734-026a-000000000404] 11000 1726867159.13108: sending task result for task 0affcac9-a3a5-c734-026a-000000000404 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11000 1726867159.13239: no more pending results, returning what we have 11000 1726867159.13243: results queue empty 11000 1726867159.13244: checking for any_errors_fatal 11000 1726867159.13248: done checking for any_errors_fatal 11000 1726867159.13249: checking for max_fail_percentage 11000 1726867159.13251: done checking for max_fail_percentage 11000 1726867159.13252: checking to see if all hosts have failed and the running result is not ok 11000 1726867159.13252: done checking to see if all hosts have failed 11000 1726867159.13253: getting the remaining hosts for this loop 11000 1726867159.13254: done getting the remaining hosts for this loop 11000 1726867159.13257: getting the next task for host managed_node1 11000 1726867159.13264: done getting next task for host managed_node1 11000 1726867159.13267: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11000 1726867159.13270: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867159.13275: getting variables 11000 1726867159.13276: in VariableManager get_vars() 11000 1726867159.13470: Calling all_inventory to load vars for managed_node1 11000 1726867159.13473: Calling groups_inventory to load vars for managed_node1 11000 1726867159.13476: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867159.13484: done sending task result for task 0affcac9-a3a5-c734-026a-000000000404 11000 1726867159.13487: WORKER PROCESS EXITING 11000 1726867159.13499: Calling all_plugins_play to load vars for managed_node1 11000 1726867159.13502: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867159.13505: Calling groups_plugins_play to load vars for managed_node1 11000 1726867159.15635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867159.16772: done with get_vars() 11000 1726867159.16790: done getting variables 11000 1726867159.16834: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867159.16916: variable 'profile' from source: include params 11000 1726867159.16919: variable 'item' from source: include params 11000 1726867159.16957: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 17:19:19 -0400 (0:00:00.055) 0:00:20.812 ****** 11000 1726867159.16980: entering _queue_task() for managed_node1/assert 11000 1726867159.17180: worker is 1 (out of 1 available) 11000 1726867159.17201: exiting _queue_task() for managed_node1/assert 11000 1726867159.17213: done queuing things up, now waiting for results queue to drain 11000 1726867159.17214: waiting for pending results... 
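The assert queued above checks the flag that get_profile_stat.yml is expected to have set. Its conditional, as evaluated a little further down, is simply lsr_net_profile_exists, so a minimal sketch of the task at assert_profile_present.yml:5 would be the following; the exact file contents are an assumption, only the task name and the asserted expression are taken from the log:

  - name: Assert that the profile is present - '{{ profile }}'
    assert:
      that:
        - lsr_net_profile_exists   # True here, so the handler reports 'All assertions passed'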
11000 1726867159.17454: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.0' 11000 1726867159.17684: in run() - task 0affcac9-a3a5-c734-026a-000000000268 11000 1726867159.17690: variable 'ansible_search_path' from source: unknown 11000 1726867159.17694: variable 'ansible_search_path' from source: unknown 11000 1726867159.18090: calling self._execute() 11000 1726867159.18198: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.18202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.18204: variable 'omit' from source: magic vars 11000 1726867159.18832: variable 'ansible_distribution_major_version' from source: facts 11000 1726867159.18854: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867159.18904: variable 'omit' from source: magic vars 11000 1726867159.18945: variable 'omit' from source: magic vars 11000 1726867159.19154: variable 'profile' from source: include params 11000 1726867159.19198: variable 'item' from source: include params 11000 1726867159.19273: variable 'item' from source: include params 11000 1726867159.19336: variable 'omit' from source: magic vars 11000 1726867159.19425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867159.19491: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867159.19522: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867159.19548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867159.19583: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867159.19637: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867159.19646: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.19657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.19765: Set connection var ansible_shell_type to sh 11000 1726867159.19782: Set connection var ansible_pipelining to False 11000 1726867159.19798: Set connection var ansible_shell_executable to /bin/sh 11000 1726867159.19805: Set connection var ansible_connection to ssh 11000 1726867159.19814: Set connection var ansible_timeout to 10 11000 1726867159.19824: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867159.19858: variable 'ansible_shell_executable' from source: unknown 11000 1726867159.20056: variable 'ansible_connection' from source: unknown 11000 1726867159.20059: variable 'ansible_module_compression' from source: unknown 11000 1726867159.20062: variable 'ansible_shell_type' from source: unknown 11000 1726867159.20064: variable 'ansible_shell_executable' from source: unknown 11000 1726867159.20067: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.20068: variable 'ansible_pipelining' from source: unknown 11000 1726867159.20071: variable 'ansible_timeout' from source: unknown 11000 1726867159.20073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.20262: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867159.20291: variable 'omit' from source: magic vars 11000 1726867159.20302: starting attempt loop 11000 1726867159.20313: running the handler 11000 1726867159.20425: variable 'lsr_net_profile_exists' from source: set_fact 11000 1726867159.20436: Evaluated conditional (lsr_net_profile_exists): True 11000 1726867159.20445: handler run complete 11000 1726867159.20462: attempt loop complete, returning result 11000 1726867159.20469: _execute() done 11000 1726867159.20475: dumping result to json 11000 1726867159.20493: done dumping result, returning 11000 1726867159.20506: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.0' [0affcac9-a3a5-c734-026a-000000000268] 11000 1726867159.20514: sending task result for task 0affcac9-a3a5-c734-026a-000000000268 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11000 1726867159.20754: no more pending results, returning what we have 11000 1726867159.20757: results queue empty 11000 1726867159.20758: checking for any_errors_fatal 11000 1726867159.20765: done checking for any_errors_fatal 11000 1726867159.20765: checking for max_fail_percentage 11000 1726867159.20767: done checking for max_fail_percentage 11000 1726867159.20768: checking to see if all hosts have failed and the running result is not ok 11000 1726867159.20769: done checking to see if all hosts have failed 11000 1726867159.20769: getting the remaining hosts for this loop 11000 1726867159.20771: done getting the remaining hosts for this loop 11000 1726867159.20774: getting the next task for host managed_node1 11000 1726867159.20782: done getting next task for host managed_node1 11000 1726867159.20785: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11000 1726867159.20791: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867159.20796: getting variables 11000 1726867159.20797: in VariableManager get_vars() 11000 1726867159.20846: Calling all_inventory to load vars for managed_node1 11000 1726867159.20849: Calling groups_inventory to load vars for managed_node1 11000 1726867159.20851: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867159.20863: Calling all_plugins_play to load vars for managed_node1 11000 1726867159.20866: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867159.20869: Calling groups_plugins_play to load vars for managed_node1 11000 1726867159.21434: done sending task result for task 0affcac9-a3a5-c734-026a-000000000268 11000 1726867159.21437: WORKER PROCESS EXITING 11000 1726867159.23260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867159.25244: done with get_vars() 11000 1726867159.25268: done getting variables 11000 1726867159.25358: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867159.25685: variable 'profile' from source: include params 11000 1726867159.25693: variable 'item' from source: include params 11000 1726867159.25757: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 17:19:19 -0400 (0:00:00.089) 0:00:20.902 ****** 11000 1726867159.25909: entering _queue_task() for managed_node1/assert 11000 1726867159.26416: worker is 1 (out of 1 available) 11000 1726867159.26429: exiting _queue_task() for managed_node1/assert 11000 1726867159.26441: done queuing things up, now waiting for results queue to drain 11000 1726867159.26443: waiting for pending results... 
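The 'Set connection var ...' entries above show where the transport settings for managed_node1 come from: ansible_host and ansible_ssh_extra_args are host vars loaded from the YAML inventory, while ansible_timeout 10, ansible_pipelining False, /bin/sh as the shell and ZIP_DEFLATED module compression appear to match stock ansible-core defaults rather than anything set for this host. A speculative sketch of the relevant part of inventory.yml; only the host name, the presence of ansible_host and the address used by ssh later in the log are grounded, the rest is assumed:

  all:
    hosts:
      managed_node1:
        ansible_host: 10.31.12.57   # address that appears in the ssh debug output below
        ansible_connection: ssh     # ssh is also the ansible-core default connection
        # ansible_ssh_extra_args is also defined in host vars; its value is not shown in this log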
11000 1726867159.26747: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' 11000 1726867159.26860: in run() - task 0affcac9-a3a5-c734-026a-000000000269 11000 1726867159.26892: variable 'ansible_search_path' from source: unknown 11000 1726867159.26903: variable 'ansible_search_path' from source: unknown 11000 1726867159.26998: calling self._execute() 11000 1726867159.27105: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.27109: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.27114: variable 'omit' from source: magic vars 11000 1726867159.27399: variable 'ansible_distribution_major_version' from source: facts 11000 1726867159.27408: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867159.27414: variable 'omit' from source: magic vars 11000 1726867159.27447: variable 'omit' from source: magic vars 11000 1726867159.27517: variable 'profile' from source: include params 11000 1726867159.27520: variable 'item' from source: include params 11000 1726867159.27568: variable 'item' from source: include params 11000 1726867159.27585: variable 'omit' from source: magic vars 11000 1726867159.27617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867159.27649: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867159.27661: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867159.27675: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867159.27687: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867159.27713: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867159.27716: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.27718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.27786: Set connection var ansible_shell_type to sh 11000 1726867159.27796: Set connection var ansible_pipelining to False 11000 1726867159.27803: Set connection var ansible_shell_executable to /bin/sh 11000 1726867159.27806: Set connection var ansible_connection to ssh 11000 1726867159.27811: Set connection var ansible_timeout to 10 11000 1726867159.27815: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867159.27836: variable 'ansible_shell_executable' from source: unknown 11000 1726867159.27839: variable 'ansible_connection' from source: unknown 11000 1726867159.27841: variable 'ansible_module_compression' from source: unknown 11000 1726867159.27843: variable 'ansible_shell_type' from source: unknown 11000 1726867159.27845: variable 'ansible_shell_executable' from source: unknown 11000 1726867159.27848: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.27852: variable 'ansible_pipelining' from source: unknown 11000 1726867159.27855: variable 'ansible_timeout' from source: unknown 11000 1726867159.27857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.27958: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867159.27969: variable 'omit' from source: magic vars 11000 1726867159.27972: starting attempt loop 11000 1726867159.27975: running the handler 11000 1726867159.28051: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11000 1726867159.28054: Evaluated conditional (lsr_net_profile_ansible_managed): True 11000 1726867159.28060: handler run complete 11000 1726867159.28071: attempt loop complete, returning result 11000 1726867159.28074: _execute() done 11000 1726867159.28076: dumping result to json 11000 1726867159.28081: done dumping result, returning 11000 1726867159.28093: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' [0affcac9-a3a5-c734-026a-000000000269] 11000 1726867159.28096: sending task result for task 0affcac9-a3a5-c734-026a-000000000269 11000 1726867159.28168: done sending task result for task 0affcac9-a3a5-c734-026a-000000000269 11000 1726867159.28170: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11000 1726867159.28234: no more pending results, returning what we have 11000 1726867159.28237: results queue empty 11000 1726867159.28238: checking for any_errors_fatal 11000 1726867159.28243: done checking for any_errors_fatal 11000 1726867159.28244: checking for max_fail_percentage 11000 1726867159.28245: done checking for max_fail_percentage 11000 1726867159.28247: checking to see if all hosts have failed and the running result is not ok 11000 1726867159.28247: done checking to see if all hosts have failed 11000 1726867159.28248: getting the remaining hosts for this loop 11000 1726867159.28249: done getting the remaining hosts for this loop 11000 1726867159.28252: getting the next task for host managed_node1 11000 1726867159.28257: done getting next task for host managed_node1 11000 1726867159.28260: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11000 1726867159.28263: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867159.28266: getting variables 11000 1726867159.28267: in VariableManager get_vars() 11000 1726867159.28303: Calling all_inventory to load vars for managed_node1 11000 1726867159.28305: Calling groups_inventory to load vars for managed_node1 11000 1726867159.28308: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867159.28317: Calling all_plugins_play to load vars for managed_node1 11000 1726867159.28319: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867159.28322: Calling groups_plugins_play to load vars for managed_node1 11000 1726867159.29475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867159.30718: done with get_vars() 11000 1726867159.30733: done getting variables 11000 1726867159.30770: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867159.30846: variable 'profile' from source: include params 11000 1726867159.30849: variable 'item' from source: include params 11000 1726867159.30889: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 17:19:19 -0400 (0:00:00.050) 0:00:20.952 ****** 11000 1726867159.30914: entering _queue_task() for managed_node1/assert 11000 1726867159.31119: worker is 1 (out of 1 available) 11000 1726867159.31131: exiting _queue_task() for managed_node1/assert 11000 1726867159.31143: done queuing things up, now waiting for results queue to drain 11000 1726867159.31145: waiting for pending results... 
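Taken together, the task paths logged for this profile (assert_profile_present.yml lines 3, 5, 10 and 15) suggest the overall shape of that file: an include of get_profile_stat.yml followed by three asserts on the flags it sets. A reconstructed outline, offered as an assumption rather than the actual file contents:

  # tasks/assert_profile_present.yml (inferred outline)
  - name: Include the task 'get_profile_stat.yml'   # line 3
    include_tasks: get_profile_stat.yml

  - name: Assert that the profile is present - '{{ profile }}'   # line 5
    assert:
      that:
        - lsr_net_profile_exists

  - name: Assert that the ansible managed comment is present in '{{ profile }}'   # line 10
    assert:
      that:
        - lsr_net_profile_ansible_managed

  - name: Assert that the fingerprint comment is present in {{ profile }}   # line 15
    assert:
      that:
        - lsr_net_profile_fingerprint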
11000 1726867159.31306: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.0 11000 1726867159.31375: in run() - task 0affcac9-a3a5-c734-026a-00000000026a 11000 1726867159.31381: variable 'ansible_search_path' from source: unknown 11000 1726867159.31384: variable 'ansible_search_path' from source: unknown 11000 1726867159.31416: calling self._execute() 11000 1726867159.31490: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.31498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.31506: variable 'omit' from source: magic vars 11000 1726867159.31757: variable 'ansible_distribution_major_version' from source: facts 11000 1726867159.31767: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867159.31773: variable 'omit' from source: magic vars 11000 1726867159.31805: variable 'omit' from source: magic vars 11000 1726867159.31870: variable 'profile' from source: include params 11000 1726867159.31874: variable 'item' from source: include params 11000 1726867159.31924: variable 'item' from source: include params 11000 1726867159.31941: variable 'omit' from source: magic vars 11000 1726867159.31970: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867159.32000: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867159.32018: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867159.32031: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867159.32042: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867159.32064: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867159.32067: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.32070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.32184: Set connection var ansible_shell_type to sh 11000 1726867159.32187: Set connection var ansible_pipelining to False 11000 1726867159.32265: Set connection var ansible_shell_executable to /bin/sh 11000 1726867159.32268: Set connection var ansible_connection to ssh 11000 1726867159.32270: Set connection var ansible_timeout to 10 11000 1726867159.32272: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867159.32275: variable 'ansible_shell_executable' from source: unknown 11000 1726867159.32280: variable 'ansible_connection' from source: unknown 11000 1726867159.32282: variable 'ansible_module_compression' from source: unknown 11000 1726867159.32284: variable 'ansible_shell_type' from source: unknown 11000 1726867159.32287: variable 'ansible_shell_executable' from source: unknown 11000 1726867159.32289: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.32291: variable 'ansible_pipelining' from source: unknown 11000 1726867159.32294: variable 'ansible_timeout' from source: unknown 11000 1726867159.32296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.32456: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867159.32525: variable 'omit' from source: magic vars 11000 1726867159.32528: starting attempt loop 11000 1726867159.32530: running the handler 11000 1726867159.32598: variable 'lsr_net_profile_fingerprint' from source: set_fact 11000 1726867159.32608: Evaluated conditional (lsr_net_profile_fingerprint): True 11000 1726867159.32617: handler run complete 11000 1726867159.32643: attempt loop complete, returning result 11000 1726867159.32649: _execute() done 11000 1726867159.32654: dumping result to json 11000 1726867159.32661: done dumping result, returning 11000 1726867159.32672: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.0 [0affcac9-a3a5-c734-026a-00000000026a] 11000 1726867159.32682: sending task result for task 0affcac9-a3a5-c734-026a-00000000026a ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11000 1726867159.32814: no more pending results, returning what we have 11000 1726867159.32818: results queue empty 11000 1726867159.32819: checking for any_errors_fatal 11000 1726867159.32824: done checking for any_errors_fatal 11000 1726867159.32825: checking for max_fail_percentage 11000 1726867159.32826: done checking for max_fail_percentage 11000 1726867159.32827: checking to see if all hosts have failed and the running result is not ok 11000 1726867159.32828: done checking to see if all hosts have failed 11000 1726867159.32829: getting the remaining hosts for this loop 11000 1726867159.32830: done getting the remaining hosts for this loop 11000 1726867159.32834: getting the next task for host managed_node1 11000 1726867159.32843: done getting next task for host managed_node1 11000 1726867159.32847: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11000 1726867159.32850: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867159.32855: getting variables 11000 1726867159.32857: in VariableManager get_vars() 11000 1726867159.32899: Calling all_inventory to load vars for managed_node1 11000 1726867159.32902: Calling groups_inventory to load vars for managed_node1 11000 1726867159.32909: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867159.32922: Calling all_plugins_play to load vars for managed_node1 11000 1726867159.32925: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867159.32928: Calling groups_plugins_play to load vars for managed_node1 11000 1726867159.33791: done sending task result for task 0affcac9-a3a5-c734-026a-00000000026a 11000 1726867159.33795: WORKER PROCESS EXITING 11000 1726867159.33805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867159.34652: done with get_vars() 11000 1726867159.34666: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 17:19:19 -0400 (0:00:00.038) 0:00:20.990 ****** 11000 1726867159.34731: entering _queue_task() for managed_node1/include_tasks 11000 1726867159.34934: worker is 1 (out of 1 available) 11000 1726867159.34946: exiting _queue_task() for managed_node1/include_tasks 11000 1726867159.34957: done queuing things up, now waiting for results queue to drain 11000 1726867159.34958: waiting for pending results... 11000 1726867159.35120: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 11000 1726867159.35195: in run() - task 0affcac9-a3a5-c734-026a-00000000026e 11000 1726867159.35204: variable 'ansible_search_path' from source: unknown 11000 1726867159.35207: variable 'ansible_search_path' from source: unknown 11000 1726867159.35234: calling self._execute() 11000 1726867159.35306: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.35310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.35319: variable 'omit' from source: magic vars 11000 1726867159.35571: variable 'ansible_distribution_major_version' from source: facts 11000 1726867159.35582: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867159.35588: _execute() done 11000 1726867159.35594: dumping result to json 11000 1726867159.35596: done dumping result, returning 11000 1726867159.35603: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [0affcac9-a3a5-c734-026a-00000000026e] 11000 1726867159.35607: sending task result for task 0affcac9-a3a5-c734-026a-00000000026e 11000 1726867159.35690: done sending task result for task 0affcac9-a3a5-c734-026a-00000000026e 11000 1726867159.35693: WORKER PROCESS EXITING 11000 1726867159.35745: no more pending results, returning what we have 11000 1726867159.35749: in VariableManager get_vars() 11000 1726867159.35786: Calling all_inventory to load vars for managed_node1 11000 1726867159.35789: Calling groups_inventory to load vars for managed_node1 11000 1726867159.35791: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867159.35800: Calling all_plugins_play to load vars for managed_node1 11000 1726867159.35802: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867159.35804: Calling groups_plugins_play 
to load vars for managed_node1 11000 1726867159.36631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867159.37474: done with get_vars() 11000 1726867159.37488: variable 'ansible_search_path' from source: unknown 11000 1726867159.37489: variable 'ansible_search_path' from source: unknown 11000 1726867159.37512: we have included files to process 11000 1726867159.37513: generating all_blocks data 11000 1726867159.37514: done generating all_blocks data 11000 1726867159.37517: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11000 1726867159.37517: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11000 1726867159.37519: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11000 1726867159.38090: done processing included file 11000 1726867159.38092: iterating over new_blocks loaded from include file 11000 1726867159.38093: in VariableManager get_vars() 11000 1726867159.38105: done with get_vars() 11000 1726867159.38106: filtering new block on tags 11000 1726867159.38123: done filtering new block on tags 11000 1726867159.38125: in VariableManager get_vars() 11000 1726867159.38137: done with get_vars() 11000 1726867159.38138: filtering new block on tags 11000 1726867159.38150: done filtering new block on tags 11000 1726867159.38151: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 11000 1726867159.38154: extending task lists for all hosts with included blocks 11000 1726867159.38254: done extending task lists 11000 1726867159.38255: done processing included files 11000 1726867159.38256: results queue empty 11000 1726867159.38256: checking for any_errors_fatal 11000 1726867159.38258: done checking for any_errors_fatal 11000 1726867159.38258: checking for max_fail_percentage 11000 1726867159.38259: done checking for max_fail_percentage 11000 1726867159.38259: checking to see if all hosts have failed and the running result is not ok 11000 1726867159.38260: done checking to see if all hosts have failed 11000 1726867159.38260: getting the remaining hosts for this loop 11000 1726867159.38261: done getting the remaining hosts for this loop 11000 1726867159.38262: getting the next task for host managed_node1 11000 1726867159.38265: done getting next task for host managed_node1 11000 1726867159.38266: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11000 1726867159.38268: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867159.38270: getting variables 11000 1726867159.38270: in VariableManager get_vars() 11000 1726867159.38281: Calling all_inventory to load vars for managed_node1 11000 1726867159.38283: Calling groups_inventory to load vars for managed_node1 11000 1726867159.38284: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867159.38287: Calling all_plugins_play to load vars for managed_node1 11000 1726867159.38289: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867159.38291: Calling groups_plugins_play to load vars for managed_node1 11000 1726867159.38913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867159.39793: done with get_vars() 11000 1726867159.39806: done getting variables 11000 1726867159.39831: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 17:19:19 -0400 (0:00:00.051) 0:00:21.041 ****** 11000 1726867159.39850: entering _queue_task() for managed_node1/set_fact 11000 1726867159.40068: worker is 1 (out of 1 available) 11000 1726867159.40081: exiting _queue_task() for managed_node1/set_fact 11000 1726867159.40094: done queuing things up, now waiting for results queue to drain 11000 1726867159.40095: waiting for pending results... 
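The initialization task queued above is the first task of get_profile_stat.yml (line 3) and, as its result a little further down shows, it simply seeds the three flags that the later asserts consume. A sketch that reproduces exactly the facts reported in the ok result; the YAML form is an assumption, the names and values come from the log:

  - name: Initialize NM profile exist and ansible_managed comment flag
    set_fact:
      lsr_net_profile_exists: false
      lsr_net_profile_ansible_managed: false
      lsr_net_profile_fingerprint: false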
11000 1726867159.40255: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 11000 1726867159.40325: in run() - task 0affcac9-a3a5-c734-026a-000000000443 11000 1726867159.40335: variable 'ansible_search_path' from source: unknown 11000 1726867159.40339: variable 'ansible_search_path' from source: unknown 11000 1726867159.40365: calling self._execute() 11000 1726867159.40446: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.40450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.40459: variable 'omit' from source: magic vars 11000 1726867159.40728: variable 'ansible_distribution_major_version' from source: facts 11000 1726867159.40737: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867159.40743: variable 'omit' from source: magic vars 11000 1726867159.40781: variable 'omit' from source: magic vars 11000 1726867159.40807: variable 'omit' from source: magic vars 11000 1726867159.40837: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867159.40870: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867159.40884: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867159.40899: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867159.40910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867159.40933: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867159.40936: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.40939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.41006: Set connection var ansible_shell_type to sh 11000 1726867159.41013: Set connection var ansible_pipelining to False 11000 1726867159.41020: Set connection var ansible_shell_executable to /bin/sh 11000 1726867159.41023: Set connection var ansible_connection to ssh 11000 1726867159.41027: Set connection var ansible_timeout to 10 11000 1726867159.41032: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867159.41051: variable 'ansible_shell_executable' from source: unknown 11000 1726867159.41054: variable 'ansible_connection' from source: unknown 11000 1726867159.41057: variable 'ansible_module_compression' from source: unknown 11000 1726867159.41059: variable 'ansible_shell_type' from source: unknown 11000 1726867159.41062: variable 'ansible_shell_executable' from source: unknown 11000 1726867159.41063: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.41075: variable 'ansible_pipelining' from source: unknown 11000 1726867159.41083: variable 'ansible_timeout' from source: unknown 11000 1726867159.41090: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.41179: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867159.41192: variable 
'omit' from source: magic vars 11000 1726867159.41195: starting attempt loop 11000 1726867159.41198: running the handler 11000 1726867159.41211: handler run complete 11000 1726867159.41218: attempt loop complete, returning result 11000 1726867159.41222: _execute() done 11000 1726867159.41224: dumping result to json 11000 1726867159.41227: done dumping result, returning 11000 1726867159.41234: done running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcac9-a3a5-c734-026a-000000000443] 11000 1726867159.41238: sending task result for task 0affcac9-a3a5-c734-026a-000000000443 ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11000 1726867159.41364: no more pending results, returning what we have 11000 1726867159.41366: results queue empty 11000 1726867159.41367: checking for any_errors_fatal 11000 1726867159.41368: done checking for any_errors_fatal 11000 1726867159.41369: checking for max_fail_percentage 11000 1726867159.41371: done checking for max_fail_percentage 11000 1726867159.41372: checking to see if all hosts have failed and the running result is not ok 11000 1726867159.41372: done checking to see if all hosts have failed 11000 1726867159.41373: getting the remaining hosts for this loop 11000 1726867159.41374: done getting the remaining hosts for this loop 11000 1726867159.41379: getting the next task for host managed_node1 11000 1726867159.41386: done getting next task for host managed_node1 11000 1726867159.41391: ^ task is: TASK: Stat profile file 11000 1726867159.41395: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867159.41398: getting variables 11000 1726867159.41399: in VariableManager get_vars() 11000 1726867159.41440: Calling all_inventory to load vars for managed_node1 11000 1726867159.41442: Calling groups_inventory to load vars for managed_node1 11000 1726867159.41444: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867159.41450: done sending task result for task 0affcac9-a3a5-c734-026a-000000000443 11000 1726867159.41452: WORKER PROCESS EXITING 11000 1726867159.41460: Calling all_plugins_play to load vars for managed_node1 11000 1726867159.41462: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867159.41465: Calling groups_plugins_play to load vars for managed_node1 11000 1726867159.42197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867159.43047: done with get_vars() 11000 1726867159.43061: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 17:19:19 -0400 (0:00:00.032) 0:00:21.074 ****** 11000 1726867159.43123: entering _queue_task() for managed_node1/stat 11000 1726867159.43322: worker is 1 (out of 1 available) 11000 1726867159.43335: exiting _queue_task() for managed_node1/stat 11000 1726867159.43346: done queuing things up, now waiting for results queue to drain 11000 1726867159.43347: waiting for pending results... 11000 1726867159.43500: running TaskExecutor() for managed_node1/TASK: Stat profile file 11000 1726867159.43563: in run() - task 0affcac9-a3a5-c734-026a-000000000444 11000 1726867159.43575: variable 'ansible_search_path' from source: unknown 11000 1726867159.43580: variable 'ansible_search_path' from source: unknown 11000 1726867159.43607: calling self._execute() 11000 1726867159.43680: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.43684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.43692: variable 'omit' from source: magic vars 11000 1726867159.43948: variable 'ansible_distribution_major_version' from source: facts 11000 1726867159.43957: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867159.43963: variable 'omit' from source: magic vars 11000 1726867159.43995: variable 'omit' from source: magic vars 11000 1726867159.44065: variable 'profile' from source: include params 11000 1726867159.44069: variable 'item' from source: include params 11000 1726867159.44114: variable 'item' from source: include params 11000 1726867159.44130: variable 'omit' from source: magic vars 11000 1726867159.44165: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867159.44189: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867159.44202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867159.44215: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867159.44227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867159.44248: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 11000 1726867159.44252: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.44254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.44319: Set connection var ansible_shell_type to sh 11000 1726867159.44325: Set connection var ansible_pipelining to False 11000 1726867159.44332: Set connection var ansible_shell_executable to /bin/sh 11000 1726867159.44336: Set connection var ansible_connection to ssh 11000 1726867159.44341: Set connection var ansible_timeout to 10 11000 1726867159.44346: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867159.44365: variable 'ansible_shell_executable' from source: unknown 11000 1726867159.44368: variable 'ansible_connection' from source: unknown 11000 1726867159.44371: variable 'ansible_module_compression' from source: unknown 11000 1726867159.44380: variable 'ansible_shell_type' from source: unknown 11000 1726867159.44384: variable 'ansible_shell_executable' from source: unknown 11000 1726867159.44386: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.44391: variable 'ansible_pipelining' from source: unknown 11000 1726867159.44394: variable 'ansible_timeout' from source: unknown 11000 1726867159.44396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.44525: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11000 1726867159.44533: variable 'omit' from source: magic vars 11000 1726867159.44538: starting attempt loop 11000 1726867159.44541: running the handler 11000 1726867159.44551: _low_level_execute_command(): starting 11000 1726867159.44558: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867159.45044: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867159.45079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867159.45083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867159.45086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867159.45088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867159.45090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867159.45141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867159.45144: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867159.45146: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 
1726867159.45206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867159.46867: stdout chunk (state=3): >>>/root <<< 11000 1726867159.46968: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867159.46998: stderr chunk (state=3): >>><<< 11000 1726867159.47001: stdout chunk (state=3): >>><<< 11000 1726867159.47016: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867159.47027: _low_level_execute_command(): starting 11000 1726867159.47033: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867159.4701588-11981-31998921819077 `" && echo ansible-tmp-1726867159.4701588-11981-31998921819077="` echo /root/.ansible/tmp/ansible-tmp-1726867159.4701588-11981-31998921819077 `" ) && sleep 0' 11000 1726867159.47460: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867159.47464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867159.47474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867159.47478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867159.47523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867159.47529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867159.47573: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 11000 1726867159.49440: stdout chunk (state=3): >>>ansible-tmp-1726867159.4701588-11981-31998921819077=/root/.ansible/tmp/ansible-tmp-1726867159.4701588-11981-31998921819077 <<< 11000 1726867159.49544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867159.49566: stderr chunk (state=3): >>><<< 11000 1726867159.49569: stdout chunk (state=3): >>><<< 11000 1726867159.49583: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867159.4701588-11981-31998921819077=/root/.ansible/tmp/ansible-tmp-1726867159.4701588-11981-31998921819077 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867159.49619: variable 'ansible_module_compression' from source: unknown 11000 1726867159.49656: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11000 1726867159.49690: variable 'ansible_facts' from source: unknown 11000 1726867159.49746: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867159.4701588-11981-31998921819077/AnsiballZ_stat.py 11000 1726867159.49834: Sending initial data 11000 1726867159.49838: Sent initial data (152 bytes) 11000 1726867159.50259: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867159.50263: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867159.50265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11000 1726867159.50268: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867159.50270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867159.50317: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867159.50325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867159.50370: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867159.51883: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11000 1726867159.51886: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867159.51922: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11000 1726867159.51969: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmphmxptdfj /root/.ansible/tmp/ansible-tmp-1726867159.4701588-11981-31998921819077/AnsiballZ_stat.py <<< 11000 1726867159.51972: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867159.4701588-11981-31998921819077/AnsiballZ_stat.py" <<< 11000 1726867159.52014: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmphmxptdfj" to remote "/root/.ansible/tmp/ansible-tmp-1726867159.4701588-11981-31998921819077/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867159.4701588-11981-31998921819077/AnsiballZ_stat.py" <<< 11000 1726867159.52551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867159.52589: stderr chunk (state=3): >>><<< 11000 1726867159.52593: stdout chunk (state=3): >>><<< 11000 1726867159.52619: done transferring module to remote 11000 1726867159.52627: _low_level_execute_command(): starting 11000 1726867159.52629: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867159.4701588-11981-31998921819077/ /root/.ansible/tmp/ansible-tmp-1726867159.4701588-11981-31998921819077/AnsiballZ_stat.py && sleep 0' 11000 1726867159.53022: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867159.53025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867159.53027: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867159.53029: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867159.53069: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867159.53085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867159.53142: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867159.54848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867159.54869: stderr chunk (state=3): >>><<< 11000 1726867159.54873: stdout chunk (state=3): >>><<< 11000 1726867159.54892: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867159.54895: _low_level_execute_command(): starting 11000 1726867159.54898: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867159.4701588-11981-31998921819077/AnsiballZ_stat.py && sleep 0' 11000 1726867159.55280: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867159.55283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867159.55286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867159.55288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867159.55291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867159.55334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867159.55337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867159.55420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867159.70563: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11000 1726867159.71894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867159.71925: stderr chunk (state=3): >>><<< 11000 1726867159.71928: stdout chunk (state=3): >>><<< 11000 1726867159.71944: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
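The module run above reduces to a single stat check against /etc/sysconfig/network-scripts/ifcfg-bond0.1, which returns {"exists": false}. A minimal sketch of the kind of task that would produce this invocation, reconstructed from the module_args shown in the log; the register name profile_stat is inferred from the later profile_stat.stat.exists conditional, and the {{ profile }} templating is an assumption (the log shows it already rendered as bond0.1):

```yaml
# Sketch reconstructed from the module_args in the log; not the verbatim
# contents of get_profile_stat.yml:9.
- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"   # rendered as ifcfg-bond0.1 in this run
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat   # assumed name, taken from the later conditional
```

Only get_attributes, get_checksum and get_mime are explicitly disabled in the invocation; follow and checksum_algorithm keep their module defaults.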
11000 1726867159.71967: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867159.4701588-11981-31998921819077/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867159.71976: _low_level_execute_command(): starting 11000 1726867159.71983: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867159.4701588-11981-31998921819077/ > /dev/null 2>&1 && sleep 0' 11000 1726867159.72552: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867159.72583: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867159.72586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867159.72592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867159.72594: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867159.72597: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867159.72627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867159.72630: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11000 1726867159.72633: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867159.72635: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11000 1726867159.72711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867159.72750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867159.72803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867159.74659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867159.74885: stderr chunk (state=3): >>><<< 11000 1726867159.74891: stdout chunk (state=3): >>><<< 11000 1726867159.74895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867159.74897: handler run complete 11000 1726867159.74899: attempt loop complete, returning result 11000 1726867159.74901: _execute() done 11000 1726867159.74903: dumping result to json 11000 1726867159.74905: done dumping result, returning 11000 1726867159.74907: done running TaskExecutor() for managed_node1/TASK: Stat profile file [0affcac9-a3a5-c734-026a-000000000444] 11000 1726867159.74909: sending task result for task 0affcac9-a3a5-c734-026a-000000000444 11000 1726867159.74982: done sending task result for task 0affcac9-a3a5-c734-026a-000000000444 11000 1726867159.74985: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 11000 1726867159.75051: no more pending results, returning what we have 11000 1726867159.75054: results queue empty 11000 1726867159.75056: checking for any_errors_fatal 11000 1726867159.75062: done checking for any_errors_fatal 11000 1726867159.75063: checking for max_fail_percentage 11000 1726867159.75064: done checking for max_fail_percentage 11000 1726867159.75065: checking to see if all hosts have failed and the running result is not ok 11000 1726867159.75066: done checking to see if all hosts have failed 11000 1726867159.75067: getting the remaining hosts for this loop 11000 1726867159.75068: done getting the remaining hosts for this loop 11000 1726867159.75072: getting the next task for host managed_node1 11000 1726867159.75085: done getting next task for host managed_node1 11000 1726867159.75090: ^ task is: TASK: Set NM profile exist flag based on the profile files 11000 1726867159.75094: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867159.75099: getting variables 11000 1726867159.75100: in VariableManager get_vars() 11000 1726867159.75143: Calling all_inventory to load vars for managed_node1 11000 1726867159.75147: Calling groups_inventory to load vars for managed_node1 11000 1726867159.75149: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867159.75161: Calling all_plugins_play to load vars for managed_node1 11000 1726867159.75163: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867159.75166: Calling groups_plugins_play to load vars for managed_node1 11000 1726867159.76350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867159.77195: done with get_vars() 11000 1726867159.77211: done getting variables 11000 1726867159.77267: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 17:19:19 -0400 (0:00:00.341) 0:00:21.416 ****** 11000 1726867159.77304: entering _queue_task() for managed_node1/set_fact 11000 1726867159.77583: worker is 1 (out of 1 available) 11000 1726867159.77599: exiting _queue_task() for managed_node1/set_fact 11000 1726867159.77610: done queuing things up, now waiting for results queue to drain 11000 1726867159.77611: waiting for pending results... 
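The task queued next (get_profile_stat.yml:17) is a set_fact guarded by the stat result; as the entries that follow show, it is skipped because profile_stat.stat.exists is false. A hedged sketch, assuming the fact being set is the lsr_net_profile_exists flag initialized earlier in the run:

```yaml
# Sketch only: the when clause matches the false_condition reported in the
# skip result below; the fact name is assumed from the flags initialized by
# the "Initialize NM profile exist and ansible_managed comment flag" task.
- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true
  when: profile_stat.stat.exists
```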
11000 1726867159.77996: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 11000 1726867159.78001: in run() - task 0affcac9-a3a5-c734-026a-000000000445 11000 1726867159.78014: variable 'ansible_search_path' from source: unknown 11000 1726867159.78019: variable 'ansible_search_path' from source: unknown 11000 1726867159.78055: calling self._execute() 11000 1726867159.78161: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.78172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.78194: variable 'omit' from source: magic vars 11000 1726867159.78561: variable 'ansible_distribution_major_version' from source: facts 11000 1726867159.78581: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867159.78709: variable 'profile_stat' from source: set_fact 11000 1726867159.78733: Evaluated conditional (profile_stat.stat.exists): False 11000 1726867159.78746: when evaluation is False, skipping this task 11000 1726867159.78755: _execute() done 11000 1726867159.78763: dumping result to json 11000 1726867159.78771: done dumping result, returning 11000 1726867159.78784: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [0affcac9-a3a5-c734-026a-000000000445] 11000 1726867159.78796: sending task result for task 0affcac9-a3a5-c734-026a-000000000445 11000 1726867159.79082: done sending task result for task 0affcac9-a3a5-c734-026a-000000000445 11000 1726867159.79086: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11000 1726867159.79128: no more pending results, returning what we have 11000 1726867159.79132: results queue empty 11000 1726867159.79133: checking for any_errors_fatal 11000 1726867159.79138: done checking for any_errors_fatal 11000 1726867159.79139: checking for max_fail_percentage 11000 1726867159.79140: done checking for max_fail_percentage 11000 1726867159.79141: checking to see if all hosts have failed and the running result is not ok 11000 1726867159.79142: done checking to see if all hosts have failed 11000 1726867159.79143: getting the remaining hosts for this loop 11000 1726867159.79144: done getting the remaining hosts for this loop 11000 1726867159.79148: getting the next task for host managed_node1 11000 1726867159.79154: done getting next task for host managed_node1 11000 1726867159.79156: ^ task is: TASK: Get NM profile info 11000 1726867159.79160: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867159.79164: getting variables 11000 1726867159.79166: in VariableManager get_vars() 11000 1726867159.79206: Calling all_inventory to load vars for managed_node1 11000 1726867159.79209: Calling groups_inventory to load vars for managed_node1 11000 1726867159.79211: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867159.79221: Calling all_plugins_play to load vars for managed_node1 11000 1726867159.79223: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867159.79227: Calling groups_plugins_play to load vars for managed_node1 11000 1726867159.80541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867159.82024: done with get_vars() 11000 1726867159.82040: done getting variables 11000 1726867159.82089: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 17:19:19 -0400 (0:00:00.048) 0:00:21.464 ****** 11000 1726867159.82110: entering _queue_task() for managed_node1/shell 11000 1726867159.82316: worker is 1 (out of 1 available) 11000 1726867159.82330: exiting _queue_task() for managed_node1/shell 11000 1726867159.82341: done queuing things up, now waiting for results queue to drain 11000 1726867159.82343: waiting for pending results... 11000 1726867159.82508: running TaskExecutor() for managed_node1/TASK: Get NM profile info 11000 1726867159.82572: in run() - task 0affcac9-a3a5-c734-026a-000000000446 11000 1726867159.82586: variable 'ansible_search_path' from source: unknown 11000 1726867159.82592: variable 'ansible_search_path' from source: unknown 11000 1726867159.82617: calling self._execute() 11000 1726867159.82695: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.82699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.82708: variable 'omit' from source: magic vars 11000 1726867159.82969: variable 'ansible_distribution_major_version' from source: facts 11000 1726867159.82981: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867159.82986: variable 'omit' from source: magic vars 11000 1726867159.83018: variable 'omit' from source: magic vars 11000 1726867159.83092: variable 'profile' from source: include params 11000 1726867159.83096: variable 'item' from source: include params 11000 1726867159.83141: variable 'item' from source: include params 11000 1726867159.83155: variable 'omit' from source: magic vars 11000 1726867159.83190: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867159.83215: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867159.83232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867159.83246: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867159.83256: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867159.83281: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867159.83284: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.83287: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.83352: Set connection var ansible_shell_type to sh 11000 1726867159.83359: Set connection var ansible_pipelining to False 11000 1726867159.83367: Set connection var ansible_shell_executable to /bin/sh 11000 1726867159.83370: Set connection var ansible_connection to ssh 11000 1726867159.83374: Set connection var ansible_timeout to 10 11000 1726867159.83380: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867159.83402: variable 'ansible_shell_executable' from source: unknown 11000 1726867159.83405: variable 'ansible_connection' from source: unknown 11000 1726867159.83407: variable 'ansible_module_compression' from source: unknown 11000 1726867159.83409: variable 'ansible_shell_type' from source: unknown 11000 1726867159.83412: variable 'ansible_shell_executable' from source: unknown 11000 1726867159.83414: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867159.83416: variable 'ansible_pipelining' from source: unknown 11000 1726867159.83419: variable 'ansible_timeout' from source: unknown 11000 1726867159.83424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867159.83523: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867159.83562: variable 'omit' from source: magic vars 11000 1726867159.83565: starting attempt loop 11000 1726867159.83568: running the handler 11000 1726867159.83571: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867159.83574: _low_level_execute_command(): starting 11000 1726867159.83582: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867159.84293: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867159.84338: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867159.84384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867159.85996: stdout chunk (state=3): >>>/root <<< 11000 1726867159.86096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867159.86129: stderr chunk (state=3): >>><<< 11000 1726867159.86132: stdout chunk (state=3): >>><<< 11000 1726867159.86143: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867159.86155: _low_level_execute_command(): starting 11000 1726867159.86160: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867159.8614397-12000-159209430134770 `" && echo ansible-tmp-1726867159.8614397-12000-159209430134770="` echo /root/.ansible/tmp/ansible-tmp-1726867159.8614397-12000-159209430134770 `" ) && sleep 0' 11000 1726867159.86635: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867159.86676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867159.86697: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867159.86776: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11000 1726867159.88654: stdout chunk (state=3): >>>ansible-tmp-1726867159.8614397-12000-159209430134770=/root/.ansible/tmp/ansible-tmp-1726867159.8614397-12000-159209430134770 <<< 11000 1726867159.88817: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867159.88820: stdout chunk (state=3): >>><<< 11000 1726867159.88825: stderr chunk (state=3): >>><<< 11000 1726867159.88848: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867159.8614397-12000-159209430134770=/root/.ansible/tmp/ansible-tmp-1726867159.8614397-12000-159209430134770 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867159.88982: variable 'ansible_module_compression' from source: unknown 11000 1726867159.88986: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11000 1726867159.88991: variable 'ansible_facts' from source: unknown 11000 1726867159.89091: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867159.8614397-12000-159209430134770/AnsiballZ_command.py 11000 1726867159.89231: Sending initial data 11000 1726867159.89341: Sent initial data (156 bytes) 11000 1726867159.89932: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867159.89955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 
1726867159.90026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867159.91542: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867159.91610: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11000 1726867159.91674: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpkzhf5bux /root/.ansible/tmp/ansible-tmp-1726867159.8614397-12000-159209430134770/AnsiballZ_command.py <<< 11000 1726867159.91685: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867159.8614397-12000-159209430134770/AnsiballZ_command.py" <<< 11000 1726867159.91711: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpkzhf5bux" to remote "/root/.ansible/tmp/ansible-tmp-1726867159.8614397-12000-159209430134770/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867159.8614397-12000-159209430134770/AnsiballZ_command.py" <<< 11000 1726867159.92456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867159.92531: stderr chunk (state=3): >>><<< 11000 1726867159.92535: stdout chunk (state=3): >>><<< 11000 1726867159.92641: done transferring module to remote 11000 1726867159.92644: _low_level_execute_command(): starting 11000 1726867159.92647: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867159.8614397-12000-159209430134770/ /root/.ansible/tmp/ansible-tmp-1726867159.8614397-12000-159209430134770/AnsiballZ_command.py && sleep 0' 11000 1726867159.93242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867159.93260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867159.93305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867159.93387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867159.93407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867159.93427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867159.93509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867159.95364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867159.95367: stdout chunk (state=3): >>><<< 11000 1726867159.95369: stderr chunk (state=3): >>><<< 11000 1726867159.95371: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867159.95381: _low_level_execute_command(): starting 11000 1726867159.95384: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867159.8614397-12000-159209430134770/AnsiballZ_command.py && sleep 0' 11000 1726867159.95887: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867159.95906: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867159.95924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867159.95998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867159.96046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867159.96061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867159.96083: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 11000 1726867159.96160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867160.13340: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 17:19:20.110797", "end": "2024-09-20 17:19:20.131662", "delta": "0:00:00.020865", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11000 1726867160.14882: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867160.14910: stderr chunk (state=3): >>><<< 11000 1726867160.14913: stdout chunk (state=3): >>><<< 11000 1726867160.14927: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 17:19:20.110797", "end": "2024-09-20 17:19:20.131662", "delta": "0:00:00.020865", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
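The command executed on the managed node is visible verbatim in the result JSON above: nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc, which found the bond0.1 profile backed by /etc/NetworkManager/system-connections/bond0.1.nmconnection. A sketch of the corresponding shell task (get_profile_stat.yml:25); the register name and the {{ profile }} templating are assumptions, and changed_when: false is inferred from the task result recorded below reporting changed: false even though the raw command module returned changed: true:

```yaml
# Sketch only: the command string is copied from the module_args in the log
# (rendered here with the bond0.1 profile); the register name is assumed.
- name: Get NM profile info
  shell: nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc
  register: nm_profile_exists   # assumed name
  changed_when: false           # inferred from the ok/changed=false result below
```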
11000 1726867160.14960: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867159.8614397-12000-159209430134770/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867160.14967: _low_level_execute_command(): starting 11000 1726867160.14972: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867159.8614397-12000-159209430134770/ > /dev/null 2>&1 && sleep 0' 11000 1726867160.15408: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867160.15411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867160.15414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867160.15416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867160.15418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867160.15466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867160.15469: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867160.15474: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867160.15520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867160.17317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867160.17342: stderr chunk (state=3): >>><<< 11000 1726867160.17345: stdout chunk (state=3): >>><<< 11000 1726867160.17356: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867160.17364: handler run complete 11000 1726867160.17381: Evaluated conditional (False): False 11000 1726867160.17391: attempt loop complete, returning result 11000 1726867160.17394: _execute() done 11000 1726867160.17397: dumping result to json 11000 1726867160.17402: done dumping result, returning 11000 1726867160.17410: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [0affcac9-a3a5-c734-026a-000000000446] 11000 1726867160.17413: sending task result for task 0affcac9-a3a5-c734-026a-000000000446 11000 1726867160.17652: done sending task result for task 0affcac9-a3a5-c734-026a-000000000446 11000 1726867160.17656: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.020865", "end": "2024-09-20 17:19:20.131662", "rc": 0, "start": "2024-09-20 17:19:20.110797" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 11000 1726867160.17758: no more pending results, returning what we have 11000 1726867160.17761: results queue empty 11000 1726867160.17762: checking for any_errors_fatal 11000 1726867160.17768: done checking for any_errors_fatal 11000 1726867160.17769: checking for max_fail_percentage 11000 1726867160.17770: done checking for max_fail_percentage 11000 1726867160.17771: checking to see if all hosts have failed and the running result is not ok 11000 1726867160.17772: done checking to see if all hosts have failed 11000 1726867160.17773: getting the remaining hosts for this loop 11000 1726867160.17774: done getting the remaining hosts for this loop 11000 1726867160.17780: getting the next task for host managed_node1 11000 1726867160.17789: done getting next task for host managed_node1 11000 1726867160.17792: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11000 1726867160.17796: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867160.17800: getting variables 11000 1726867160.17801: in VariableManager get_vars() 11000 1726867160.17842: Calling all_inventory to load vars for managed_node1 11000 1726867160.17844: Calling groups_inventory to load vars for managed_node1 11000 1726867160.17846: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867160.17857: Calling all_plugins_play to load vars for managed_node1 11000 1726867160.17859: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867160.17862: Calling groups_plugins_play to load vars for managed_node1 11000 1726867160.19358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867160.20221: done with get_vars() 11000 1726867160.20237: done getting variables 11000 1726867160.20282: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 17:19:20 -0400 (0:00:00.381) 0:00:21.846 ****** 11000 1726867160.20307: entering _queue_task() for managed_node1/set_fact 11000 1726867160.20540: worker is 1 (out of 1 available) 11000 1726867160.20553: exiting _queue_task() for managed_node1/set_fact 11000 1726867160.20564: done queuing things up, now waiting for results queue to drain 11000 1726867160.20565: waiting for pending results... 
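The next trace runs the set_fact action queued above. It first gates on ansible_distribution_major_version != '6' and nm_profile_exists.rc == 0, then reports three boolean facts in its result. A hedged sketch consistent with that behaviour is shown below; the actual task body at get_profile_stat.yml:35 may differ.

```yaml
# Sketch only: the fact names and rc-based condition are read off the trace that follows.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0
```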
11000 1726867160.20730: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11000 1726867160.20804: in run() - task 0affcac9-a3a5-c734-026a-000000000447 11000 1726867160.20815: variable 'ansible_search_path' from source: unknown 11000 1726867160.20818: variable 'ansible_search_path' from source: unknown 11000 1726867160.20850: calling self._execute() 11000 1726867160.20926: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.20929: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.20938: variable 'omit' from source: magic vars 11000 1726867160.21203: variable 'ansible_distribution_major_version' from source: facts 11000 1726867160.21213: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867160.21302: variable 'nm_profile_exists' from source: set_fact 11000 1726867160.21315: Evaluated conditional (nm_profile_exists.rc == 0): True 11000 1726867160.21320: variable 'omit' from source: magic vars 11000 1726867160.21359: variable 'omit' from source: magic vars 11000 1726867160.21381: variable 'omit' from source: magic vars 11000 1726867160.21413: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867160.21438: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867160.21457: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867160.21470: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867160.21562: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867160.21565: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867160.21568: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.21570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.21572: Set connection var ansible_shell_type to sh 11000 1726867160.21579: Set connection var ansible_pipelining to False 11000 1726867160.21587: Set connection var ansible_shell_executable to /bin/sh 11000 1726867160.21592: Set connection var ansible_connection to ssh 11000 1726867160.21595: Set connection var ansible_timeout to 10 11000 1726867160.21599: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867160.21620: variable 'ansible_shell_executable' from source: unknown 11000 1726867160.21623: variable 'ansible_connection' from source: unknown 11000 1726867160.21625: variable 'ansible_module_compression' from source: unknown 11000 1726867160.21628: variable 'ansible_shell_type' from source: unknown 11000 1726867160.21630: variable 'ansible_shell_executable' from source: unknown 11000 1726867160.21633: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.21636: variable 'ansible_pipelining' from source: unknown 11000 1726867160.21638: variable 'ansible_timeout' from source: unknown 11000 1726867160.21643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.21743: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867160.21752: variable 'omit' from source: magic vars 11000 1726867160.21757: starting attempt loop 11000 1726867160.21760: running the handler 11000 1726867160.21772: handler run complete 11000 1726867160.21781: attempt loop complete, returning result 11000 1726867160.21784: _execute() done 11000 1726867160.21790: dumping result to json 11000 1726867160.21792: done dumping result, returning 11000 1726867160.21798: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcac9-a3a5-c734-026a-000000000447] 11000 1726867160.21802: sending task result for task 0affcac9-a3a5-c734-026a-000000000447 11000 1726867160.21873: done sending task result for task 0affcac9-a3a5-c734-026a-000000000447 11000 1726867160.21876: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11000 1726867160.21933: no more pending results, returning what we have 11000 1726867160.21936: results queue empty 11000 1726867160.21937: checking for any_errors_fatal 11000 1726867160.21945: done checking for any_errors_fatal 11000 1726867160.21946: checking for max_fail_percentage 11000 1726867160.21947: done checking for max_fail_percentage 11000 1726867160.21948: checking to see if all hosts have failed and the running result is not ok 11000 1726867160.21949: done checking to see if all hosts have failed 11000 1726867160.21950: getting the remaining hosts for this loop 11000 1726867160.21951: done getting the remaining hosts for this loop 11000 1726867160.21954: getting the next task for host managed_node1 11000 1726867160.21962: done getting next task for host managed_node1 11000 1726867160.21964: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11000 1726867160.21967: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867160.21971: getting variables 11000 1726867160.21972: in VariableManager get_vars() 11000 1726867160.22012: Calling all_inventory to load vars for managed_node1 11000 1726867160.22015: Calling groups_inventory to load vars for managed_node1 11000 1726867160.22017: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867160.22025: Calling all_plugins_play to load vars for managed_node1 11000 1726867160.22028: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867160.22030: Calling groups_plugins_play to load vars for managed_node1 11000 1726867160.22757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867160.23691: done with get_vars() 11000 1726867160.23707: done getting variables 11000 1726867160.23751: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867160.23835: variable 'profile' from source: include params 11000 1726867160.23838: variable 'item' from source: include params 11000 1726867160.23880: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 17:19:20 -0400 (0:00:00.036) 0:00:21.882 ****** 11000 1726867160.23909: entering _queue_task() for managed_node1/command 11000 1726867160.24135: worker is 1 (out of 1 available) 11000 1726867160.24147: exiting _queue_task() for managed_node1/command 11000 1726867160.24160: done queuing things up, now waiting for results queue to drain 11000 1726867160.24162: waiting for pending results... 
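The four ifcfg-oriented tasks that follow (get/verify the ansible_managed comment and get/verify the fingerprint comment, get_profile_stat.yml lines 49-69) are all skipped with false_condition: profile_stat.stat.exists, because bond0.1 lives in a NetworkManager keyfile rather than an initscripts ifcfg file. The gating pattern is roughly the sketch below; the grep command body is a hypothetical placeholder, since a skipped task never reveals its arguments in the log.

```yaml
# Illustrative gate only; the real command in get_profile_stat.yml is not visible here.
- name: Get the ansible_managed comment in ifcfg-bond0.1
  ansible.builtin.command: grep '^# Ansible managed' /etc/sysconfig/network-scripts/ifcfg-bond0.1  # hypothetical
  when: profile_stat.stat.exists   # False in this run, so the task is skipped
```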
11000 1726867160.24323: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 11000 1726867160.24401: in run() - task 0affcac9-a3a5-c734-026a-000000000449 11000 1726867160.24413: variable 'ansible_search_path' from source: unknown 11000 1726867160.24417: variable 'ansible_search_path' from source: unknown 11000 1726867160.24444: calling self._execute() 11000 1726867160.24524: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.24528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.24537: variable 'omit' from source: magic vars 11000 1726867160.24797: variable 'ansible_distribution_major_version' from source: facts 11000 1726867160.24807: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867160.24894: variable 'profile_stat' from source: set_fact 11000 1726867160.24904: Evaluated conditional (profile_stat.stat.exists): False 11000 1726867160.24906: when evaluation is False, skipping this task 11000 1726867160.24911: _execute() done 11000 1726867160.24914: dumping result to json 11000 1726867160.24916: done dumping result, returning 11000 1726867160.24924: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [0affcac9-a3a5-c734-026a-000000000449] 11000 1726867160.24926: sending task result for task 0affcac9-a3a5-c734-026a-000000000449 11000 1726867160.25008: done sending task result for task 0affcac9-a3a5-c734-026a-000000000449 11000 1726867160.25011: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11000 1726867160.25091: no more pending results, returning what we have 11000 1726867160.25095: results queue empty 11000 1726867160.25096: checking for any_errors_fatal 11000 1726867160.25101: done checking for any_errors_fatal 11000 1726867160.25102: checking for max_fail_percentage 11000 1726867160.25103: done checking for max_fail_percentage 11000 1726867160.25104: checking to see if all hosts have failed and the running result is not ok 11000 1726867160.25105: done checking to see if all hosts have failed 11000 1726867160.25105: getting the remaining hosts for this loop 11000 1726867160.25107: done getting the remaining hosts for this loop 11000 1726867160.25110: getting the next task for host managed_node1 11000 1726867160.25116: done getting next task for host managed_node1 11000 1726867160.25118: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11000 1726867160.25121: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867160.25125: getting variables 11000 1726867160.25126: in VariableManager get_vars() 11000 1726867160.25157: Calling all_inventory to load vars for managed_node1 11000 1726867160.25159: Calling groups_inventory to load vars for managed_node1 11000 1726867160.25161: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867160.25170: Calling all_plugins_play to load vars for managed_node1 11000 1726867160.25173: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867160.25175: Calling groups_plugins_play to load vars for managed_node1 11000 1726867160.25926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867160.26782: done with get_vars() 11000 1726867160.26801: done getting variables 11000 1726867160.26848: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867160.26929: variable 'profile' from source: include params 11000 1726867160.26932: variable 'item' from source: include params 11000 1726867160.26971: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 17:19:20 -0400 (0:00:00.030) 0:00:21.913 ****** 11000 1726867160.26999: entering _queue_task() for managed_node1/set_fact 11000 1726867160.27320: worker is 1 (out of 1 available) 11000 1726867160.27333: exiting _queue_task() for managed_node1/set_fact 11000 1726867160.27346: done queuing things up, now waiting for results queue to drain 11000 1726867160.27348: waiting for pending results... 
11000 1726867160.27799: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 11000 1726867160.27805: in run() - task 0affcac9-a3a5-c734-026a-00000000044a 11000 1726867160.27809: variable 'ansible_search_path' from source: unknown 11000 1726867160.27812: variable 'ansible_search_path' from source: unknown 11000 1726867160.27839: calling self._execute() 11000 1726867160.27956: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.27969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.27991: variable 'omit' from source: magic vars 11000 1726867160.28383: variable 'ansible_distribution_major_version' from source: facts 11000 1726867160.28411: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867160.28543: variable 'profile_stat' from source: set_fact 11000 1726867160.28565: Evaluated conditional (profile_stat.stat.exists): False 11000 1726867160.28574: when evaluation is False, skipping this task 11000 1726867160.28596: _execute() done 11000 1726867160.28606: dumping result to json 11000 1726867160.28615: done dumping result, returning 11000 1726867160.28682: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [0affcac9-a3a5-c734-026a-00000000044a] 11000 1726867160.28685: sending task result for task 0affcac9-a3a5-c734-026a-00000000044a 11000 1726867160.28759: done sending task result for task 0affcac9-a3a5-c734-026a-00000000044a 11000 1726867160.28762: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11000 1726867160.28849: no more pending results, returning what we have 11000 1726867160.28854: results queue empty 11000 1726867160.28855: checking for any_errors_fatal 11000 1726867160.28862: done checking for any_errors_fatal 11000 1726867160.28863: checking for max_fail_percentage 11000 1726867160.28864: done checking for max_fail_percentage 11000 1726867160.28865: checking to see if all hosts have failed and the running result is not ok 11000 1726867160.28866: done checking to see if all hosts have failed 11000 1726867160.28867: getting the remaining hosts for this loop 11000 1726867160.28869: done getting the remaining hosts for this loop 11000 1726867160.28872: getting the next task for host managed_node1 11000 1726867160.28881: done getting next task for host managed_node1 11000 1726867160.28883: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11000 1726867160.28891: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867160.28896: getting variables 11000 1726867160.28898: in VariableManager get_vars() 11000 1726867160.28941: Calling all_inventory to load vars for managed_node1 11000 1726867160.28944: Calling groups_inventory to load vars for managed_node1 11000 1726867160.28947: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867160.28960: Calling all_plugins_play to load vars for managed_node1 11000 1726867160.28964: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867160.28967: Calling groups_plugins_play to load vars for managed_node1 11000 1726867160.30704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867160.37264: done with get_vars() 11000 1726867160.37336: done getting variables 11000 1726867160.37393: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867160.37498: variable 'profile' from source: include params 11000 1726867160.37502: variable 'item' from source: include params 11000 1726867160.37648: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 17:19:20 -0400 (0:00:00.106) 0:00:22.019 ****** 11000 1726867160.37679: entering _queue_task() for managed_node1/command 11000 1726867160.38114: worker is 1 (out of 1 available) 11000 1726867160.38127: exiting _queue_task() for managed_node1/command 11000 1726867160.38145: done queuing things up, now waiting for results queue to drain 11000 1726867160.38147: waiting for pending results... 
11000 1726867160.38348: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 11000 1726867160.38449: in run() - task 0affcac9-a3a5-c734-026a-00000000044b 11000 1726867160.38466: variable 'ansible_search_path' from source: unknown 11000 1726867160.38469: variable 'ansible_search_path' from source: unknown 11000 1726867160.38504: calling self._execute() 11000 1726867160.38607: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.38613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.38624: variable 'omit' from source: magic vars 11000 1726867160.38995: variable 'ansible_distribution_major_version' from source: facts 11000 1726867160.39011: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867160.39132: variable 'profile_stat' from source: set_fact 11000 1726867160.39147: Evaluated conditional (profile_stat.stat.exists): False 11000 1726867160.39150: when evaluation is False, skipping this task 11000 1726867160.39153: _execute() done 11000 1726867160.39155: dumping result to json 11000 1726867160.39158: done dumping result, returning 11000 1726867160.39167: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 [0affcac9-a3a5-c734-026a-00000000044b] 11000 1726867160.39172: sending task result for task 0affcac9-a3a5-c734-026a-00000000044b 11000 1726867160.39271: done sending task result for task 0affcac9-a3a5-c734-026a-00000000044b 11000 1726867160.39274: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11000 1726867160.39327: no more pending results, returning what we have 11000 1726867160.39444: results queue empty 11000 1726867160.39445: checking for any_errors_fatal 11000 1726867160.39451: done checking for any_errors_fatal 11000 1726867160.39452: checking for max_fail_percentage 11000 1726867160.39453: done checking for max_fail_percentage 11000 1726867160.39454: checking to see if all hosts have failed and the running result is not ok 11000 1726867160.39455: done checking to see if all hosts have failed 11000 1726867160.39456: getting the remaining hosts for this loop 11000 1726867160.39457: done getting the remaining hosts for this loop 11000 1726867160.39460: getting the next task for host managed_node1 11000 1726867160.39465: done getting next task for host managed_node1 11000 1726867160.39467: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11000 1726867160.39471: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867160.39475: getting variables 11000 1726867160.39476: in VariableManager get_vars() 11000 1726867160.39516: Calling all_inventory to load vars for managed_node1 11000 1726867160.39518: Calling groups_inventory to load vars for managed_node1 11000 1726867160.39520: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867160.39530: Calling all_plugins_play to load vars for managed_node1 11000 1726867160.39532: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867160.39535: Calling groups_plugins_play to load vars for managed_node1 11000 1726867160.40934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867160.42584: done with get_vars() 11000 1726867160.42615: done getting variables 11000 1726867160.42673: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867160.42804: variable 'profile' from source: include params 11000 1726867160.42808: variable 'item' from source: include params 11000 1726867160.42874: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 17:19:20 -0400 (0:00:00.052) 0:00:22.072 ****** 11000 1726867160.42913: entering _queue_task() for managed_node1/set_fact 11000 1726867160.43493: worker is 1 (out of 1 available) 11000 1726867160.43503: exiting _queue_task() for managed_node1/set_fact 11000 1726867160.43514: done queuing things up, now waiting for results queue to drain 11000 1726867160.43516: waiting for pending results... 
11000 1726867160.43756: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 11000 1726867160.43761: in run() - task 0affcac9-a3a5-c734-026a-00000000044c 11000 1726867160.43765: variable 'ansible_search_path' from source: unknown 11000 1726867160.43768: variable 'ansible_search_path' from source: unknown 11000 1726867160.43807: calling self._execute() 11000 1726867160.43922: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.43936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.43950: variable 'omit' from source: magic vars 11000 1726867160.44359: variable 'ansible_distribution_major_version' from source: facts 11000 1726867160.44381: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867160.44523: variable 'profile_stat' from source: set_fact 11000 1726867160.44542: Evaluated conditional (profile_stat.stat.exists): False 11000 1726867160.44550: when evaluation is False, skipping this task 11000 1726867160.44558: _execute() done 11000 1726867160.44565: dumping result to json 11000 1726867160.44573: done dumping result, returning 11000 1726867160.44586: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [0affcac9-a3a5-c734-026a-00000000044c] 11000 1726867160.44601: sending task result for task 0affcac9-a3a5-c734-026a-00000000044c 11000 1726867160.44797: done sending task result for task 0affcac9-a3a5-c734-026a-00000000044c 11000 1726867160.44801: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11000 1726867160.44874: no more pending results, returning what we have 11000 1726867160.44881: results queue empty 11000 1726867160.44882: checking for any_errors_fatal 11000 1726867160.44890: done checking for any_errors_fatal 11000 1726867160.44891: checking for max_fail_percentage 11000 1726867160.44893: done checking for max_fail_percentage 11000 1726867160.44894: checking to see if all hosts have failed and the running result is not ok 11000 1726867160.44895: done checking to see if all hosts have failed 11000 1726867160.44896: getting the remaining hosts for this loop 11000 1726867160.44898: done getting the remaining hosts for this loop 11000 1726867160.44902: getting the next task for host managed_node1 11000 1726867160.44911: done getting next task for host managed_node1 11000 1726867160.44914: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11000 1726867160.44918: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867160.44923: getting variables 11000 1726867160.44925: in VariableManager get_vars() 11000 1726867160.44975: Calling all_inventory to load vars for managed_node1 11000 1726867160.44981: Calling groups_inventory to load vars for managed_node1 11000 1726867160.44983: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867160.45000: Calling all_plugins_play to load vars for managed_node1 11000 1726867160.45003: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867160.45007: Calling groups_plugins_play to load vars for managed_node1 11000 1726867160.46805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867160.48435: done with get_vars() 11000 1726867160.48458: done getting variables 11000 1726867160.48518: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867160.48641: variable 'profile' from source: include params 11000 1726867160.48645: variable 'item' from source: include params 11000 1726867160.48706: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 17:19:20 -0400 (0:00:00.058) 0:00:22.130 ****** 11000 1726867160.48736: entering _queue_task() for managed_node1/assert 11000 1726867160.49193: worker is 1 (out of 1 available) 11000 1726867160.49204: exiting _queue_task() for managed_node1/assert 11000 1726867160.49217: done queuing things up, now waiting for results queue to drain 11000 1726867160.49219: waiting for pending results... 
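The remaining traces in assert_profile_present.yml (lines 5, 10 and 15 per the task paths above and below) each evaluate one of the flags set earlier and report "All assertions passed". A hedged reconstruction of those assertions, using only the condition names visible in the trace; any fail_msg text in the real file is unknown.

```yaml
# Sketch based on the evaluated conditionals reported by the executor.
- name: Assert that the profile is present - 'bond0.1'
  ansible.builtin.assert:
    that:
      - lsr_net_profile_exists

- name: Assert that the ansible managed comment is present in 'bond0.1'
  ansible.builtin.assert:
    that:
      - lsr_net_profile_ansible_managed

- name: Assert that the fingerprint comment is present in 'bond0.1'
  ansible.builtin.assert:
    that:
      - lsr_net_profile_fingerprint
```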
11000 1726867160.49460: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.1' 11000 1726867160.49526: in run() - task 0affcac9-a3a5-c734-026a-00000000026f 11000 1726867160.49551: variable 'ansible_search_path' from source: unknown 11000 1726867160.49562: variable 'ansible_search_path' from source: unknown 11000 1726867160.49607: calling self._execute() 11000 1726867160.49774: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.49779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.49783: variable 'omit' from source: magic vars 11000 1726867160.50139: variable 'ansible_distribution_major_version' from source: facts 11000 1726867160.50157: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867160.50169: variable 'omit' from source: magic vars 11000 1726867160.50223: variable 'omit' from source: magic vars 11000 1726867160.50335: variable 'profile' from source: include params 11000 1726867160.50346: variable 'item' from source: include params 11000 1726867160.50415: variable 'item' from source: include params 11000 1726867160.50484: variable 'omit' from source: magic vars 11000 1726867160.50496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867160.50541: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867160.50565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867160.50593: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867160.50610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867160.50648: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867160.50682: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.50685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.50772: Set connection var ansible_shell_type to sh 11000 1726867160.50790: Set connection var ansible_pipelining to False 11000 1726867160.50805: Set connection var ansible_shell_executable to /bin/sh 11000 1726867160.50863: Set connection var ansible_connection to ssh 11000 1726867160.50866: Set connection var ansible_timeout to 10 11000 1726867160.50869: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867160.50871: variable 'ansible_shell_executable' from source: unknown 11000 1726867160.50873: variable 'ansible_connection' from source: unknown 11000 1726867160.50875: variable 'ansible_module_compression' from source: unknown 11000 1726867160.50879: variable 'ansible_shell_type' from source: unknown 11000 1726867160.50884: variable 'ansible_shell_executable' from source: unknown 11000 1726867160.50895: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.50903: variable 'ansible_pipelining' from source: unknown 11000 1726867160.50910: variable 'ansible_timeout' from source: unknown 11000 1726867160.50918: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.51068: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867160.51282: variable 'omit' from source: magic vars 11000 1726867160.51285: starting attempt loop 11000 1726867160.51290: running the handler 11000 1726867160.51292: variable 'lsr_net_profile_exists' from source: set_fact 11000 1726867160.51295: Evaluated conditional (lsr_net_profile_exists): True 11000 1726867160.51297: handler run complete 11000 1726867160.51299: attempt loop complete, returning result 11000 1726867160.51301: _execute() done 11000 1726867160.51303: dumping result to json 11000 1726867160.51306: done dumping result, returning 11000 1726867160.51308: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.1' [0affcac9-a3a5-c734-026a-00000000026f] 11000 1726867160.51310: sending task result for task 0affcac9-a3a5-c734-026a-00000000026f 11000 1726867160.51380: done sending task result for task 0affcac9-a3a5-c734-026a-00000000026f 11000 1726867160.51384: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11000 1726867160.51434: no more pending results, returning what we have 11000 1726867160.51437: results queue empty 11000 1726867160.51438: checking for any_errors_fatal 11000 1726867160.51445: done checking for any_errors_fatal 11000 1726867160.51446: checking for max_fail_percentage 11000 1726867160.51447: done checking for max_fail_percentage 11000 1726867160.51448: checking to see if all hosts have failed and the running result is not ok 11000 1726867160.51449: done checking to see if all hosts have failed 11000 1726867160.51450: getting the remaining hosts for this loop 11000 1726867160.51451: done getting the remaining hosts for this loop 11000 1726867160.51455: getting the next task for host managed_node1 11000 1726867160.51461: done getting next task for host managed_node1 11000 1726867160.51464: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11000 1726867160.51467: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867160.51471: getting variables 11000 1726867160.51473: in VariableManager get_vars() 11000 1726867160.51518: Calling all_inventory to load vars for managed_node1 11000 1726867160.51521: Calling groups_inventory to load vars for managed_node1 11000 1726867160.51524: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867160.51536: Calling all_plugins_play to load vars for managed_node1 11000 1726867160.51539: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867160.51542: Calling groups_plugins_play to load vars for managed_node1 11000 1726867160.53191: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867160.54828: done with get_vars() 11000 1726867160.54853: done getting variables 11000 1726867160.54916: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867160.55025: variable 'profile' from source: include params 11000 1726867160.55029: variable 'item' from source: include params 11000 1726867160.55092: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 17:19:20 -0400 (0:00:00.063) 0:00:22.194 ****** 11000 1726867160.55127: entering _queue_task() for managed_node1/assert 11000 1726867160.55601: worker is 1 (out of 1 available) 11000 1726867160.55611: exiting _queue_task() for managed_node1/assert 11000 1726867160.55621: done queuing things up, now waiting for results queue to drain 11000 1726867160.55622: waiting for pending results... 
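Throughout these checks the executor resolves 'profile' and 'item' from include params before rendering each templated task name, so "... present in '{{ profile }}'" becomes "... present in 'bond0.1'". That is consistent with the per-profile checks being driven by an include_tasks loop along the lines of the hedged sketch below; only bond0.1 is visible as a loop item in this part of the log, and the driver task name is an assumption.

```yaml
# Hypothetical driver; only the profile/item-from-include-params pattern is taken from the log.
- name: Run the profile assertions for each expected connection   # name assumed
  ansible.builtin.include_tasks: assert_profile_present.yml
  loop:
    - bond0.1     # the only item visible in this portion of the trace
  vars:
    profile: "{{ item }}"
```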
11000 1726867160.55861: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' 11000 1726867160.55866: in run() - task 0affcac9-a3a5-c734-026a-000000000270 11000 1726867160.55869: variable 'ansible_search_path' from source: unknown 11000 1726867160.55872: variable 'ansible_search_path' from source: unknown 11000 1726867160.55892: calling self._execute() 11000 1726867160.56000: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.56012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.56026: variable 'omit' from source: magic vars 11000 1726867160.56407: variable 'ansible_distribution_major_version' from source: facts 11000 1726867160.56424: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867160.56437: variable 'omit' from source: magic vars 11000 1726867160.56479: variable 'omit' from source: magic vars 11000 1726867160.56591: variable 'profile' from source: include params 11000 1726867160.56601: variable 'item' from source: include params 11000 1726867160.56672: variable 'item' from source: include params 11000 1726867160.56723: variable 'omit' from source: magic vars 11000 1726867160.56747: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867160.56790: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867160.56832: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867160.56846: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867160.56882: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867160.56903: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867160.56913: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.56941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.57030: Set connection var ansible_shell_type to sh 11000 1726867160.57051: Set connection var ansible_pipelining to False 11000 1726867160.57082: Set connection var ansible_shell_executable to /bin/sh 11000 1726867160.57085: Set connection var ansible_connection to ssh 11000 1726867160.57090: Set connection var ansible_timeout to 10 11000 1726867160.57093: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867160.57159: variable 'ansible_shell_executable' from source: unknown 11000 1726867160.57162: variable 'ansible_connection' from source: unknown 11000 1726867160.57164: variable 'ansible_module_compression' from source: unknown 11000 1726867160.57166: variable 'ansible_shell_type' from source: unknown 11000 1726867160.57168: variable 'ansible_shell_executable' from source: unknown 11000 1726867160.57170: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.57171: variable 'ansible_pipelining' from source: unknown 11000 1726867160.57174: variable 'ansible_timeout' from source: unknown 11000 1726867160.57179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.57319: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867160.57334: variable 'omit' from source: magic vars 11000 1726867160.57378: starting attempt loop 11000 1726867160.57382: running the handler 11000 1726867160.57461: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11000 1726867160.57472: Evaluated conditional (lsr_net_profile_ansible_managed): True 11000 1726867160.57494: handler run complete 11000 1726867160.57513: attempt loop complete, returning result 11000 1726867160.57597: _execute() done 11000 1726867160.57600: dumping result to json 11000 1726867160.57602: done dumping result, returning 11000 1726867160.57605: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' [0affcac9-a3a5-c734-026a-000000000270] 11000 1726867160.57607: sending task result for task 0affcac9-a3a5-c734-026a-000000000270 11000 1726867160.57667: done sending task result for task 0affcac9-a3a5-c734-026a-000000000270 11000 1726867160.57669: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11000 1726867160.57744: no more pending results, returning what we have 11000 1726867160.57747: results queue empty 11000 1726867160.57749: checking for any_errors_fatal 11000 1726867160.57757: done checking for any_errors_fatal 11000 1726867160.57758: checking for max_fail_percentage 11000 1726867160.57760: done checking for max_fail_percentage 11000 1726867160.57761: checking to see if all hosts have failed and the running result is not ok 11000 1726867160.57762: done checking to see if all hosts have failed 11000 1726867160.57762: getting the remaining hosts for this loop 11000 1726867160.57764: done getting the remaining hosts for this loop 11000 1726867160.57767: getting the next task for host managed_node1 11000 1726867160.57773: done getting next task for host managed_node1 11000 1726867160.57775: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11000 1726867160.57781: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867160.57785: getting variables 11000 1726867160.57786: in VariableManager get_vars() 11000 1726867160.57832: Calling all_inventory to load vars for managed_node1 11000 1726867160.57835: Calling groups_inventory to load vars for managed_node1 11000 1726867160.57838: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867160.57849: Calling all_plugins_play to load vars for managed_node1 11000 1726867160.57853: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867160.57856: Calling groups_plugins_play to load vars for managed_node1 11000 1726867160.59563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867160.61171: done with get_vars() 11000 1726867160.61199: done getting variables 11000 1726867160.61255: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867160.61367: variable 'profile' from source: include params 11000 1726867160.61371: variable 'item' from source: include params 11000 1726867160.61438: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 17:19:20 -0400 (0:00:00.063) 0:00:22.257 ****** 11000 1726867160.61474: entering _queue_task() for managed_node1/assert 11000 1726867160.62011: worker is 1 (out of 1 available) 11000 1726867160.62019: exiting _queue_task() for managed_node1/assert 11000 1726867160.62029: done queuing things up, now waiting for results queue to drain 11000 1726867160.62030: waiting for pending results... 
11000 1726867160.62096: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.1 11000 1726867160.62258: in run() - task 0affcac9-a3a5-c734-026a-000000000271 11000 1726867160.62262: variable 'ansible_search_path' from source: unknown 11000 1726867160.62265: variable 'ansible_search_path' from source: unknown 11000 1726867160.62280: calling self._execute() 11000 1726867160.62393: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.62406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.62476: variable 'omit' from source: magic vars 11000 1726867160.62805: variable 'ansible_distribution_major_version' from source: facts 11000 1726867160.62822: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867160.62834: variable 'omit' from source: magic vars 11000 1726867160.62872: variable 'omit' from source: magic vars 11000 1726867160.62981: variable 'profile' from source: include params 11000 1726867160.62995: variable 'item' from source: include params 11000 1726867160.63065: variable 'item' from source: include params 11000 1726867160.63093: variable 'omit' from source: magic vars 11000 1726867160.63240: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867160.63244: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867160.63247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867160.63249: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867160.63251: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867160.63276: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867160.63291: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.63301: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.63406: Set connection var ansible_shell_type to sh 11000 1726867160.63420: Set connection var ansible_pipelining to False 11000 1726867160.63432: Set connection var ansible_shell_executable to /bin/sh 11000 1726867160.63438: Set connection var ansible_connection to ssh 11000 1726867160.63447: Set connection var ansible_timeout to 10 11000 1726867160.63464: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867160.63498: variable 'ansible_shell_executable' from source: unknown 11000 1726867160.63506: variable 'ansible_connection' from source: unknown 11000 1726867160.63514: variable 'ansible_module_compression' from source: unknown 11000 1726867160.63521: variable 'ansible_shell_type' from source: unknown 11000 1726867160.63527: variable 'ansible_shell_executable' from source: unknown 11000 1726867160.63534: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.63541: variable 'ansible_pipelining' from source: unknown 11000 1726867160.63549: variable 'ansible_timeout' from source: unknown 11000 1726867160.63672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.63717: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867160.63735: variable 'omit' from source: magic vars 11000 1726867160.63746: starting attempt loop 11000 1726867160.63753: running the handler 11000 1726867160.63871: variable 'lsr_net_profile_fingerprint' from source: set_fact 11000 1726867160.63883: Evaluated conditional (lsr_net_profile_fingerprint): True 11000 1726867160.63906: handler run complete 11000 1726867160.64005: attempt loop complete, returning result 11000 1726867160.64008: _execute() done 11000 1726867160.64011: dumping result to json 11000 1726867160.64013: done dumping result, returning 11000 1726867160.64015: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.1 [0affcac9-a3a5-c734-026a-000000000271] 11000 1726867160.64017: sending task result for task 0affcac9-a3a5-c734-026a-000000000271 11000 1726867160.64111: done sending task result for task 0affcac9-a3a5-c734-026a-000000000271 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11000 1726867160.64160: no more pending results, returning what we have 11000 1726867160.64163: results queue empty 11000 1726867160.64165: checking for any_errors_fatal 11000 1726867160.64171: done checking for any_errors_fatal 11000 1726867160.64172: checking for max_fail_percentage 11000 1726867160.64173: done checking for max_fail_percentage 11000 1726867160.64174: checking to see if all hosts have failed and the running result is not ok 11000 1726867160.64175: done checking to see if all hosts have failed 11000 1726867160.64176: getting the remaining hosts for this loop 11000 1726867160.64179: done getting the remaining hosts for this loop 11000 1726867160.64183: getting the next task for host managed_node1 11000 1726867160.64194: done getting next task for host managed_node1 11000 1726867160.64197: ^ task is: TASK: ** TEST check polling interval 11000 1726867160.64198: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867160.64204: getting variables 11000 1726867160.64205: in VariableManager get_vars() 11000 1726867160.64251: Calling all_inventory to load vars for managed_node1 11000 1726867160.64254: Calling groups_inventory to load vars for managed_node1 11000 1726867160.64256: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867160.64268: Calling all_plugins_play to load vars for managed_node1 11000 1726867160.64271: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867160.64274: Calling groups_plugins_play to load vars for managed_node1 11000 1726867160.65061: WORKER PROCESS EXITING 11000 1726867160.65839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867160.67610: done with get_vars() 11000 1726867160.67629: done getting variables 11000 1726867160.67683: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:75 Friday 20 September 2024 17:19:20 -0400 (0:00:00.062) 0:00:22.320 ****** 11000 1726867160.67718: entering _queue_task() for managed_node1/command 11000 1726867160.68197: worker is 1 (out of 1 available) 11000 1726867160.68208: exiting _queue_task() for managed_node1/command 11000 1726867160.68217: done queuing things up, now waiting for results queue to drain 11000 1726867160.68218: waiting for pending results... 
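The assertion above ("Assert that the fingerprint comment is present in bond0.1") passes purely on the controller: the log shows only variable resolution between "running the handler" and "handler run complete", no _low_level_execute_command calls, because the assert action just evaluates lsr_net_profile_fingerprint, which an earlier set_fact task made truthy, and reports "All assertions passed". As a rough reconstruction (the actual task file is not shown in this log, so this is a sketch, not the verbatim source), the task likely amounts to:

    # Hypothetical reconstruction of the assert task seen in the log
    - name: Assert that the fingerprint comment is present in bond0.1
      ansible.builtin.assert:
        that:
          - lsr_net_profile_fingerprint

The conditional ansible_distribution_major_version != '6' that the log evaluates first is most likely inherited from the surrounding play or block rather than written on this task.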
11000 1726867160.68301: running TaskExecutor() for managed_node1/TASK: ** TEST check polling interval 11000 1726867160.68409: in run() - task 0affcac9-a3a5-c734-026a-000000000071 11000 1726867160.68429: variable 'ansible_search_path' from source: unknown 11000 1726867160.68475: calling self._execute() 11000 1726867160.68585: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.68601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.68614: variable 'omit' from source: magic vars 11000 1726867160.69005: variable 'ansible_distribution_major_version' from source: facts 11000 1726867160.69023: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867160.69036: variable 'omit' from source: magic vars 11000 1726867160.69058: variable 'omit' from source: magic vars 11000 1726867160.69165: variable 'controller_device' from source: play vars 11000 1726867160.69200: variable 'omit' from source: magic vars 11000 1726867160.69248: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867160.69290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867160.69382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867160.69386: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867160.69391: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867160.69393: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867160.69395: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.69401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.69505: Set connection var ansible_shell_type to sh 11000 1726867160.69518: Set connection var ansible_pipelining to False 11000 1726867160.69537: Set connection var ansible_shell_executable to /bin/sh 11000 1726867160.69544: Set connection var ansible_connection to ssh 11000 1726867160.69554: Set connection var ansible_timeout to 10 11000 1726867160.69562: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867160.69596: variable 'ansible_shell_executable' from source: unknown 11000 1726867160.69605: variable 'ansible_connection' from source: unknown 11000 1726867160.69612: variable 'ansible_module_compression' from source: unknown 11000 1726867160.69639: variable 'ansible_shell_type' from source: unknown 11000 1726867160.69642: variable 'ansible_shell_executable' from source: unknown 11000 1726867160.69644: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867160.69646: variable 'ansible_pipelining' from source: unknown 11000 1726867160.69648: variable 'ansible_timeout' from source: unknown 11000 1726867160.69652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867160.69859: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867160.69863: variable 'omit' from source: magic vars 
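Before the command handler runs, the executor resolves the connection settings listed above: ansible_shell_type=sh, ansible_pipelining=False, ansible_shell_executable=/bin/sh, ansible_connection=ssh, ansible_timeout=10 and ansible_module_compression=ZIP_DEFLATED. Most of them come back "from source: unknown", meaning they fall through to built-in defaults rather than being set in the inventory or play. If the same behaviour ever needed to be pinned explicitly, it could be expressed per host, for example as below; the file placement and the idea of pinning these values are assumptions for illustration, not something this test setup does:

    # host_vars/managed_node1.yml (hypothetical)
    ansible_connection: ssh
    ansible_shell_type: sh
    ansible_shell_executable: /bin/sh
    ansible_pipelining: false
    ansible_timeout: 10
    ansible_module_compression: ZIP_DEFLATED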
11000 1726867160.69865: starting attempt loop 11000 1726867160.69867: running the handler 11000 1726867160.69869: _low_level_execute_command(): starting 11000 1726867160.69872: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867160.70582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867160.70633: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867160.70648: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11000 1726867160.70661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867160.70744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867160.70764: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867160.70784: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867160.70809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867160.70897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867160.72599: stdout chunk (state=3): >>>/root <<< 11000 1726867160.72735: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867160.72764: stderr chunk (state=3): >>><<< 11000 1726867160.72767: stdout chunk (state=3): >>><<< 11000 1726867160.72793: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867160.72884: _low_level_execute_command(): starting 11000 1726867160.72891: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` 
echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867160.7280166-12048-245902944908050 `" && echo ansible-tmp-1726867160.7280166-12048-245902944908050="` echo /root/.ansible/tmp/ansible-tmp-1726867160.7280166-12048-245902944908050 `" ) && sleep 0' 11000 1726867160.73463: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867160.73575: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867160.73608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867160.73686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867160.75580: stdout chunk (state=3): >>>ansible-tmp-1726867160.7280166-12048-245902944908050=/root/.ansible/tmp/ansible-tmp-1726867160.7280166-12048-245902944908050 <<< 11000 1726867160.75719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867160.75743: stdout chunk (state=3): >>><<< 11000 1726867160.75746: stderr chunk (state=3): >>><<< 11000 1726867160.75982: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867160.7280166-12048-245902944908050=/root/.ansible/tmp/ansible-tmp-1726867160.7280166-12048-245902944908050 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867160.75986: variable 'ansible_module_compression' from source: unknown 11000 1726867160.75991: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11000 1726867160.75993: variable 'ansible_facts' from source: unknown 11000 1726867160.75995: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867160.7280166-12048-245902944908050/AnsiballZ_command.py 11000 1726867160.76126: Sending initial data 11000 1726867160.76134: Sent initial data (156 bytes) 11000 1726867160.76759: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867160.76780: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867160.76889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867160.76931: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867160.76965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867160.78525: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867160.78590: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867160.78665: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpr5h699ct /root/.ansible/tmp/ansible-tmp-1726867160.7280166-12048-245902944908050/AnsiballZ_command.py <<< 11000 1726867160.78676: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867160.7280166-12048-245902944908050/AnsiballZ_command.py" <<< 11000 1726867160.78706: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpr5h699ct" to remote "/root/.ansible/tmp/ansible-tmp-1726867160.7280166-12048-245902944908050/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867160.7280166-12048-245902944908050/AnsiballZ_command.py" <<< 11000 1726867160.79486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867160.79517: stderr chunk (state=3): >>><<< 11000 1726867160.79625: stdout chunk (state=3): >>><<< 11000 1726867160.79629: done transferring module to remote 11000 1726867160.79631: _low_level_execute_command(): starting 11000 1726867160.79641: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867160.7280166-12048-245902944908050/ /root/.ansible/tmp/ansible-tmp-1726867160.7280166-12048-245902944908050/AnsiballZ_command.py && sleep 0' 11000 1726867160.80182: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867160.80203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867160.80218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867160.80235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867160.80253: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867160.80266: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867160.80307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867160.80322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867160.80395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867160.80418: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867160.80439: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867160.80520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867160.82298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867160.82383: stdout chunk (state=3): >>><<< 11000 1726867160.82389: stderr chunk (state=3): >>><<< 11000 1726867160.82393: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867160.82395: _low_level_execute_command(): starting 11000 1726867160.82398: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867160.7280166-12048-245902944908050/AnsiballZ_command.py && sleep 0' 11000 1726867160.82964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867160.82982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867160.82999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867160.83016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867160.83037: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867160.83134: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867160.83157: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867160.83171: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867160.83259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867160.98684: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/deprecated-bond"], "start": "2024-09-20 17:19:20.981550", "end": "2024-09-20 17:19:20.985000", "delta": "0:00:00.003450", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": 
null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11000 1726867161.00244: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867161.00264: stdout chunk (state=3): >>><<< 11000 1726867161.00279: stderr chunk (state=3): >>><<< 11000 1726867161.00307: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/deprecated-bond"], "start": "2024-09-20 17:19:20.981550", "end": "2024-09-20 17:19:20.985000", "delta": "0:00:00.003450", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
11000 1726867161.00436: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867160.7280166-12048-245902944908050/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867161.00439: _low_level_execute_command(): starting 11000 1726867161.00442: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867160.7280166-12048-245902944908050/ > /dev/null 2>&1 && sleep 0' 11000 1726867161.01060: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867161.01074: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867161.01100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867161.01134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867161.01222: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.01258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867161.01275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867161.01301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867161.01381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867161.03266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867161.03270: stdout chunk (state=3): >>><<< 11000 1726867161.03272: stderr chunk (state=3): >>><<< 11000 1726867161.03293: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867161.03491: handler run complete 11000 1726867161.03495: Evaluated conditional (False): False 11000 1726867161.03498: variable 'result' from source: unknown 11000 1726867161.03520: Evaluated conditional ('110' in result.stdout): True 11000 1726867161.03538: attempt loop complete, returning result 11000 1726867161.03545: _execute() done 11000 1726867161.03552: dumping result to json 11000 1726867161.03561: done dumping result, returning 11000 1726867161.03574: done running TaskExecutor() for managed_node1/TASK: ** TEST check polling interval [0affcac9-a3a5-c734-026a-000000000071] 11000 1726867161.03590: sending task result for task 0affcac9-a3a5-c734-026a-000000000071 ok: [managed_node1] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/deprecated-bond" ], "delta": "0:00:00.003450", "end": "2024-09-20 17:19:20.985000", "rc": 0, "start": "2024-09-20 17:19:20.981550" } STDOUT: MII Polling Interval (ms): 110 11000 1726867161.03862: no more pending results, returning what we have 11000 1726867161.03866: results queue empty 11000 1726867161.03867: checking for any_errors_fatal 11000 1726867161.03873: done checking for any_errors_fatal 11000 1726867161.03874: checking for max_fail_percentage 11000 1726867161.03875: done checking for max_fail_percentage 11000 1726867161.03876: checking to see if all hosts have failed and the running result is not ok 11000 1726867161.03879: done checking to see if all hosts have failed 11000 1726867161.03880: getting the remaining hosts for this loop 11000 1726867161.03881: done getting the remaining hosts for this loop 11000 1726867161.03885: getting the next task for host managed_node1 11000 1726867161.03893: done getting next task for host managed_node1 11000 1726867161.03896: ^ task is: TASK: ** TEST check IPv4 11000 1726867161.03899: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867161.03903: getting variables 11000 1726867161.03904: in VariableManager get_vars() 11000 1726867161.03947: Calling all_inventory to load vars for managed_node1 11000 1726867161.03950: Calling groups_inventory to load vars for managed_node1 11000 1726867161.03952: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867161.03964: Calling all_plugins_play to load vars for managed_node1 11000 1726867161.03967: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867161.03970: Calling groups_plugins_play to load vars for managed_node1 11000 1726867161.03989: done sending task result for task 0affcac9-a3a5-c734-026a-000000000071 11000 1726867161.03992: WORKER PROCESS EXITING 11000 1726867161.05706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867161.07419: done with get_vars() 11000 1726867161.07441: done getting variables 11000 1726867161.07511: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:80 Friday 20 September 2024 17:19:21 -0400 (0:00:00.398) 0:00:22.718 ****** 11000 1726867161.07538: entering _queue_task() for managed_node1/command 11000 1726867161.08025: worker is 1 (out of 1 available) 11000 1726867161.08036: exiting _queue_task() for managed_node1/command 11000 1726867161.08046: done queuing things up, now waiting for results queue to drain 11000 1726867161.08047: waiting for pending results... 
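The "** TEST check polling interval" task above ran grep 'Polling Interval' /proc/net/bonding/deprecated-bond on the managed node, got "MII Polling Interval (ms): 110" back, and then evaluated the conditional '110' in result.stdout to True; the "attempts": 1 field in the printed result indicates the task sits in an until/retries loop. Since controller_device comes from play vars in this run and expands to deprecated-bond, the task at tests_bond_deprecated.yml:75 plausibly looks something like the sketch below. This is a reconstruction from the logged module arguments, not the verbatim playbook: the retries/delay values are unknown and omitted, and changed_when: false is only inferred from the module reporting changed=true while the task result shows changed=false.

    # Hypothetical reconstruction of the polling-interval check
    - name: "** TEST check polling interval"
      ansible.builtin.command: "grep 'Polling Interval' /proc/net/bonding/{{ controller_device }}"
      register: result
      until: "'110' in result.stdout"
      changed_when: false   # inferred, see note above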
11000 1726867161.08216: running TaskExecutor() for managed_node1/TASK: ** TEST check IPv4 11000 1726867161.08325: in run() - task 0affcac9-a3a5-c734-026a-000000000072 11000 1726867161.08352: variable 'ansible_search_path' from source: unknown 11000 1726867161.08400: calling self._execute() 11000 1726867161.08520: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867161.08532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867161.08547: variable 'omit' from source: magic vars 11000 1726867161.08963: variable 'ansible_distribution_major_version' from source: facts 11000 1726867161.08984: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867161.09005: variable 'omit' from source: magic vars 11000 1726867161.09033: variable 'omit' from source: magic vars 11000 1726867161.09145: variable 'controller_device' from source: play vars 11000 1726867161.09169: variable 'omit' from source: magic vars 11000 1726867161.09250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867161.09267: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867161.09297: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867161.09322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867161.09358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867161.09386: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867161.09442: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867161.09446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867161.09520: Set connection var ansible_shell_type to sh 11000 1726867161.09534: Set connection var ansible_pipelining to False 11000 1726867161.09555: Set connection var ansible_shell_executable to /bin/sh 11000 1726867161.09563: Set connection var ansible_connection to ssh 11000 1726867161.09580: Set connection var ansible_timeout to 10 11000 1726867161.09660: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867161.09663: variable 'ansible_shell_executable' from source: unknown 11000 1726867161.09665: variable 'ansible_connection' from source: unknown 11000 1726867161.09668: variable 'ansible_module_compression' from source: unknown 11000 1726867161.09670: variable 'ansible_shell_type' from source: unknown 11000 1726867161.09672: variable 'ansible_shell_executable' from source: unknown 11000 1726867161.09674: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867161.09676: variable 'ansible_pipelining' from source: unknown 11000 1726867161.09679: variable 'ansible_timeout' from source: unknown 11000 1726867161.09681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867161.09825: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867161.09843: variable 'omit' from source: magic vars 11000 
1726867161.09854: starting attempt loop 11000 1726867161.09861: running the handler 11000 1726867161.09903: _low_level_execute_command(): starting 11000 1726867161.09909: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867161.10770: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.10827: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867161.10847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867161.10881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867161.10967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867161.12666: stdout chunk (state=3): >>>/root <<< 11000 1726867161.12802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867161.12806: stdout chunk (state=3): >>><<< 11000 1726867161.12808: stderr chunk (state=3): >>><<< 11000 1726867161.12828: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867161.12850: _low_level_execute_command(): starting 11000 1726867161.12862: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867161.1283662-12060-138769746635476 `" && echo ansible-tmp-1726867161.1283662-12060-138769746635476="` echo 
/root/.ansible/tmp/ansible-tmp-1726867161.1283662-12060-138769746635476 `" ) && sleep 0' 11000 1726867161.13537: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867161.13561: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867161.13580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867161.13604: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867161.13681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.13734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867161.13752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867161.13786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867161.13866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867161.15726: stdout chunk (state=3): >>>ansible-tmp-1726867161.1283662-12060-138769746635476=/root/.ansible/tmp/ansible-tmp-1726867161.1283662-12060-138769746635476 <<< 11000 1726867161.15896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867161.15899: stdout chunk (state=3): >>><<< 11000 1726867161.15901: stderr chunk (state=3): >>><<< 11000 1726867161.15917: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867161.1283662-12060-138769746635476=/root/.ansible/tmp/ansible-tmp-1726867161.1283662-12060-138769746635476 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867161.15955: variable 'ansible_module_compression' from source: unknown 
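As with the previous task, the connection plugin first creates a per-task working directory named ansible-tmp-<timestamp>-<pid>-<random> under /root/.ansible/tmp before the module payload is copied over, and the module invocations later in the log show that location coming from '_ansible_remote_tmp': '~/.ansible/tmp', the default remote temporary directory. If that location ever had to be moved, it could be overridden per host with the ansible_remote_tmp variable, roughly as below; this override is an illustrative assumption and is not part of this test run:

    # host_vars/managed_node1.yml (hypothetical override of the remote temp dir)
    ansible_remote_tmp: /var/tmp/ansible-tmp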
11000 1726867161.16083: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11000 1726867161.16086: variable 'ansible_facts' from source: unknown 11000 1726867161.16166: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867161.1283662-12060-138769746635476/AnsiballZ_command.py 11000 1726867161.16413: Sending initial data 11000 1726867161.16424: Sent initial data (156 bytes) 11000 1726867161.17010: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867161.17025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867161.17095: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.17129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867161.17145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867161.17165: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867161.17245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867161.18756: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11000 1726867161.18790: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867161.18841: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867161.18924: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpvol46x5z /root/.ansible/tmp/ansible-tmp-1726867161.1283662-12060-138769746635476/AnsiballZ_command.py <<< 11000 1726867161.18927: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867161.1283662-12060-138769746635476/AnsiballZ_command.py" <<< 11000 1726867161.18984: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpvol46x5z" to remote "/root/.ansible/tmp/ansible-tmp-1726867161.1283662-12060-138769746635476/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867161.1283662-12060-138769746635476/AnsiballZ_command.py" <<< 11000 1726867161.19563: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867161.19678: stderr chunk (state=3): >>><<< 11000 1726867161.19682: stdout chunk (state=3): >>><<< 11000 1726867161.19684: done transferring module to remote 11000 1726867161.19686: _low_level_execute_command(): starting 11000 1726867161.19688: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867161.1283662-12060-138769746635476/ /root/.ansible/tmp/ansible-tmp-1726867161.1283662-12060-138769746635476/AnsiballZ_command.py && sleep 0' 11000 1726867161.20052: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867161.20057: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867161.20059: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.20111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867161.20115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867161.20117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867161.20167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867161.21885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867161.21974: stderr chunk (state=3): >>><<< 11000 1726867161.21980: stdout chunk (state=3): >>><<< 11000 1726867161.21984: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867161.21986: _low_level_execute_command(): starting 11000 1726867161.21989: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867161.1283662-12060-138769746635476/AnsiballZ_command.py && sleep 0' 11000 1726867161.22319: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867161.22335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867161.22346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.22389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867161.22402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867161.22465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867161.37781: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.26/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "deprecated-bond"], "start": "2024-09-20 17:19:21.372295", "end": "2024-09-20 17:19:21.375961", "delta": "0:00:00.003666", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11000 1726867161.39259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 11000 1726867161.39296: stderr chunk (state=3): >>><<< 11000 1726867161.39300: stdout chunk (state=3): >>><<< 11000 1726867161.39319: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.26/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "deprecated-bond"], "start": "2024-09-20 17:19:21.372295", "end": "2024-09-20 17:19:21.375961", "delta": "0:00:00.003666", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
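The "** TEST check IPv4" task mirrors the previous check: it runs ip -4 a s deprecated-bond on the managed node, the returned stdout shows the bond interface up with a dynamic address, inet 192.0.2.26/24, and the conditional '192.0.2' in result.stdout evaluated a little further down comes out True (again with "attempts": 1, pointing at an until loop). Using the same controller_device play variable, the task at tests_bond_deprecated.yml:80 plausibly reduces to the sketch below; as before this is reconstructed from the logged module arguments rather than quoted from the playbook, with retries/delay unknown and omitted:

    # Hypothetical reconstruction of the IPv4 address check
    - name: "** TEST check IPv4"
      ansible.builtin.command: "ip -4 a s {{ controller_device }}"
      register: result
      until: "'192.0.2' in result.stdout"
      changed_when: false   # inferred from the changed=false task result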
11000 1726867161.39347: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867161.1283662-12060-138769746635476/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867161.39355: _low_level_execute_command(): starting 11000 1726867161.39359: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867161.1283662-12060-138769746635476/ > /dev/null 2>&1 && sleep 0' 11000 1726867161.39894: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867161.39898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.39912: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.39971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867161.39974: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867161.39980: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867161.40022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867161.41818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867161.41840: stderr chunk (state=3): >>><<< 11000 1726867161.41843: stdout chunk (state=3): >>><<< 11000 1726867161.41855: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867161.41861: handler run complete 11000 1726867161.41881: Evaluated conditional (False): False 11000 1726867161.41993: variable 'result' from source: set_fact 11000 1726867161.42006: Evaluated conditional ('192.0.2' in result.stdout): True 11000 1726867161.42017: attempt loop complete, returning result 11000 1726867161.42020: _execute() done 11000 1726867161.42023: dumping result to json 11000 1726867161.42032: done dumping result, returning 11000 1726867161.42036: done running TaskExecutor() for managed_node1/TASK: ** TEST check IPv4 [0affcac9-a3a5-c734-026a-000000000072] 11000 1726867161.42041: sending task result for task 0affcac9-a3a5-c734-026a-000000000072 11000 1726867161.42221: done sending task result for task 0affcac9-a3a5-c734-026a-000000000072 11000 1726867161.42230: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "deprecated-bond" ], "delta": "0:00:00.003666", "end": "2024-09-20 17:19:21.375961", "rc": 0, "start": "2024-09-20 17:19:21.372295" } STDOUT: 13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.26/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond valid_lft 236sec preferred_lft 236sec 11000 1726867161.42340: no more pending results, returning what we have 11000 1726867161.42343: results queue empty 11000 1726867161.42344: checking for any_errors_fatal 11000 1726867161.42350: done checking for any_errors_fatal 11000 1726867161.42351: checking for max_fail_percentage 11000 1726867161.42352: done checking for max_fail_percentage 11000 1726867161.42353: checking to see if all hosts have failed and the running result is not ok 11000 1726867161.42354: done checking to see if all hosts have failed 11000 1726867161.42354: getting the remaining hosts for this loop 11000 1726867161.42357: done getting the remaining hosts for this loop 11000 1726867161.42360: getting the next task for host managed_node1 11000 1726867161.42369: done getting next task for host managed_node1 11000 1726867161.42373: ^ task is: TASK: ** TEST check IPv6 11000 1726867161.42375: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867161.42382: getting variables 11000 1726867161.42383: in VariableManager get_vars() 11000 1726867161.42425: Calling all_inventory to load vars for managed_node1 11000 1726867161.42430: Calling groups_inventory to load vars for managed_node1 11000 1726867161.42437: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867161.42474: Calling all_plugins_play to load vars for managed_node1 11000 1726867161.42479: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867161.42483: Calling groups_plugins_play to load vars for managed_node1 11000 1726867161.44653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867161.46724: done with get_vars() 11000 1726867161.46755: done getting variables 11000 1726867161.46815: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:87 Friday 20 September 2024 17:19:21 -0400 (0:00:00.393) 0:00:23.111 ****** 11000 1726867161.46850: entering _queue_task() for managed_node1/command 11000 1726867161.47224: worker is 1 (out of 1 available) 11000 1726867161.47237: exiting _queue_task() for managed_node1/command 11000 1726867161.47250: done queuing things up, now waiting for results queue to drain 11000 1726867161.47251: waiting for pending results... 11000 1726867161.47682: running TaskExecutor() for managed_node1/TASK: ** TEST check IPv6 11000 1726867161.47691: in run() - task 0affcac9-a3a5-c734-026a-000000000073 11000 1726867161.47695: variable 'ansible_search_path' from source: unknown 11000 1726867161.47932: calling self._execute() 11000 1726867161.48109: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867161.48170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867161.48192: variable 'omit' from source: magic vars 11000 1726867161.48834: variable 'ansible_distribution_major_version' from source: facts 11000 1726867161.48838: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867161.48841: variable 'omit' from source: magic vars 11000 1726867161.48861: variable 'omit' from source: magic vars 11000 1726867161.49283: variable 'controller_device' from source: play vars 11000 1726867161.49289: variable 'omit' from source: magic vars 11000 1726867161.49455: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867161.49596: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867161.49599: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867161.49601: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867161.49631: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867161.49866: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 11000 1726867161.49878: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867161.49919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867161.50245: Set connection var ansible_shell_type to sh 11000 1726867161.50248: Set connection var ansible_pipelining to False 11000 1726867161.50253: Set connection var ansible_shell_executable to /bin/sh 11000 1726867161.50256: Set connection var ansible_connection to ssh 11000 1726867161.50264: Set connection var ansible_timeout to 10 11000 1726867161.50381: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867161.50498: variable 'ansible_shell_executable' from source: unknown 11000 1726867161.50501: variable 'ansible_connection' from source: unknown 11000 1726867161.50575: variable 'ansible_module_compression' from source: unknown 11000 1726867161.50580: variable 'ansible_shell_type' from source: unknown 11000 1726867161.50583: variable 'ansible_shell_executable' from source: unknown 11000 1726867161.50585: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867161.50594: variable 'ansible_pipelining' from source: unknown 11000 1726867161.50597: variable 'ansible_timeout' from source: unknown 11000 1726867161.50599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867161.50850: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867161.50912: variable 'omit' from source: magic vars 11000 1726867161.50919: starting attempt loop 11000 1726867161.50921: running the handler 11000 1726867161.51001: _low_level_execute_command(): starting 11000 1726867161.51004: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867161.51851: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867161.51885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867161.51928: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.51953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867161.51998: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867161.52059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867161.53681: stdout chunk (state=3): >>>/root <<< 11000 
1726867161.53779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867161.53837: stderr chunk (state=3): >>><<< 11000 1726867161.53840: stdout chunk (state=3): >>><<< 11000 1726867161.53975: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867161.54042: _low_level_execute_command(): starting 11000 1726867161.54046: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867161.5388768-12077-260525605343645 `" && echo ansible-tmp-1726867161.5388768-12077-260525605343645="` echo /root/.ansible/tmp/ansible-tmp-1726867161.5388768-12077-260525605343645 `" ) && sleep 0' 11000 1726867161.54558: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867161.54562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.54566: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.54648: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867161.54709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867161.56639: stdout chunk (state=3): >>>ansible-tmp-1726867161.5388768-12077-260525605343645=/root/.ansible/tmp/ansible-tmp-1726867161.5388768-12077-260525605343645 <<< 11000 1726867161.56811: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 <<< 11000 1726867161.56815: stderr chunk (state=3): >>><<< 11000 1726867161.56818: stdout chunk (state=3): >>><<< 11000 1726867161.56820: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867161.5388768-12077-260525605343645=/root/.ansible/tmp/ansible-tmp-1726867161.5388768-12077-260525605343645 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867161.56822: variable 'ansible_module_compression' from source: unknown 11000 1726867161.56871: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11000 1726867161.56919: variable 'ansible_facts' from source: unknown 11000 1726867161.57055: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867161.5388768-12077-260525605343645/AnsiballZ_command.py 11000 1726867161.57316: Sending initial data 11000 1726867161.57324: Sent initial data (156 bytes) 11000 1726867161.57781: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867161.57789: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867161.57813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.57816: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867161.57819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.57882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867161.57945: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 11000 1726867161.59493: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867161.59558: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11000 1726867161.59609: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmp6d6q4rhi /root/.ansible/tmp/ansible-tmp-1726867161.5388768-12077-260525605343645/AnsiballZ_command.py <<< 11000 1726867161.59614: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867161.5388768-12077-260525605343645/AnsiballZ_command.py" <<< 11000 1726867161.59660: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmp6d6q4rhi" to remote "/root/.ansible/tmp/ansible-tmp-1726867161.5388768-12077-260525605343645/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867161.5388768-12077-260525605343645/AnsiballZ_command.py" <<< 11000 1726867161.60536: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867161.60626: stderr chunk (state=3): >>><<< 11000 1726867161.60649: stdout chunk (state=3): >>><<< 11000 1726867161.60662: done transferring module to remote 11000 1726867161.60669: _low_level_execute_command(): starting 11000 1726867161.60693: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867161.5388768-12077-260525605343645/ /root/.ansible/tmp/ansible-tmp-1726867161.5388768-12077-260525605343645/AnsiballZ_command.py && sleep 0' 11000 1726867161.61229: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867161.61236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867161.61240: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.61242: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867161.61244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867161.61246: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.61289: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867161.61294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867161.61296: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867161.61338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867161.63058: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867161.63100: stderr chunk (state=3): >>><<< 11000 1726867161.63103: stdout chunk (state=3): >>><<< 11000 1726867161.63106: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867161.63109: _low_level_execute_command(): starting 11000 1726867161.63111: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867161.5388768-12077-260525605343645/AnsiballZ_command.py && sleep 0' 11000 1726867161.63641: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867161.63646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867161.63648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.63650: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867161.63653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.63707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 
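Each task in this log repeats the same low-level sequence seen above: create a remote temp dir, sftp the AnsiballZ_command.py payload, chmod it, run it with the remote Python, then remove the temp dir. That is the non-pipelined path, consistent with "Set connection var ansible_pipelining to False" earlier in this task. A minimal sketch, assuming host or group variables are used for connection settings, of how pipelining could be enabled to collapse those round-trips for modules that support it:

    # group_vars/all.yml (sketch, not part of this test run)
    ansible_pipelining: true   # send the module over stdin instead of sftp + temp files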
11000 1726867161.63712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867161.63767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867161.79097: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::19/128 scope global dynamic noprefixroute \n valid_lft 237sec preferred_lft 237sec\n inet6 2001:db8::f4e5:e8ff:fef7:1b94/64 scope global dynamic noprefixroute \n valid_lft 1797sec preferred_lft 1797sec\n inet6 fe80::f4e5:e8ff:fef7:1b94/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "deprecated-bond"], "start": "2024-09-20 17:19:21.785626", "end": "2024-09-20 17:19:21.789198", "delta": "0:00:00.003572", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11000 1726867161.80622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867161.80645: stderr chunk (state=3): >>><<< 11000 1726867161.80648: stdout chunk (state=3): >>><<< 11000 1726867161.80663: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::19/128 scope global dynamic noprefixroute \n valid_lft 237sec preferred_lft 237sec\n inet6 2001:db8::f4e5:e8ff:fef7:1b94/64 scope global dynamic noprefixroute \n valid_lft 1797sec preferred_lft 1797sec\n inet6 fe80::f4e5:e8ff:fef7:1b94/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "deprecated-bond"], "start": "2024-09-20 17:19:21.785626", "end": "2024-09-20 17:19:21.789198", "delta": "0:00:00.003572", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 Shared connection to 10.31.12.57 closed. 11000 1726867161.80696: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867161.5388768-12077-260525605343645/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867161.80704: _low_level_execute_command(): starting 11000 1726867161.80709: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867161.5388768-12077-260525605343645/ > /dev/null 2>&1 && sleep 0' 11000 1726867161.81141: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867161.81144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.81153: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867161.81155: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867161.81157: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867161.81204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867161.81208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867161.81257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867161.83317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867161.83320: stderr chunk (state=3): >>><<< 11000 1726867161.83322: stdout chunk (state=3): >>><<< 11000 1726867161.83325: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867161.83327: handler run complete 11000 1726867161.83329: Evaluated conditional (False): False 11000 1726867161.83334: variable 'result' from source: set_fact 11000 1726867161.83353: Evaluated conditional ('2001' in result.stdout): True 11000 1726867161.83366: attempt loop complete, returning result 11000 1726867161.83369: _execute() done 11000 1726867161.83371: dumping result to json 11000 1726867161.83379: done dumping result, returning 11000 1726867161.83392: done running TaskExecutor() for managed_node1/TASK: ** TEST check IPv6 [0affcac9-a3a5-c734-026a-000000000073] 11000 1726867161.83402: sending task result for task 0affcac9-a3a5-c734-026a-000000000073 ok: [managed_node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "deprecated-bond" ], "delta": "0:00:00.003572", "end": "2024-09-20 17:19:21.789198", "rc": 0, "start": "2024-09-20 17:19:21.785626" } STDOUT: 13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::19/128 scope global dynamic noprefixroute valid_lft 237sec preferred_lft 237sec inet6 2001:db8::f4e5:e8ff:fef7:1b94/64 scope global dynamic noprefixroute valid_lft 1797sec preferred_lft 1797sec inet6 fe80::f4e5:e8ff:fef7:1b94/64 scope link noprefixroute valid_lft forever preferred_lft forever 11000 1726867161.83604: no more pending results, returning what we have 11000 1726867161.83608: results queue empty 11000 1726867161.83609: checking for any_errors_fatal 11000 1726867161.83618: done checking for any_errors_fatal 11000 1726867161.83618: checking for max_fail_percentage 11000 1726867161.83620: done checking for max_fail_percentage 11000 1726867161.83621: checking to see if all hosts have failed and the running result is not ok 11000 1726867161.83622: done checking to see if all hosts have failed 11000 1726867161.83623: getting the remaining hosts for this loop 11000 1726867161.83625: done getting the remaining hosts for this loop 11000 1726867161.83629: getting the next task for host managed_node1 11000 1726867161.83643: done getting next task for host managed_node1 11000 1726867161.83651: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11000 1726867161.83654: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867161.83883: getting variables 11000 1726867161.83886: in VariableManager get_vars() 11000 1726867161.83927: Calling all_inventory to load vars for managed_node1 11000 1726867161.83930: Calling groups_inventory to load vars for managed_node1 11000 1726867161.83933: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867161.83943: Calling all_plugins_play to load vars for managed_node1 11000 1726867161.83946: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867161.83951: Calling groups_plugins_play to load vars for managed_node1 11000 1726867161.84482: done sending task result for task 0affcac9-a3a5-c734-026a-000000000073 11000 1726867161.84486: WORKER PROCESS EXITING 11000 1726867161.85421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867161.86996: done with get_vars() 11000 1726867161.87020: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 17:19:21 -0400 (0:00:00.402) 0:00:23.514 ****** 11000 1726867161.87127: entering _queue_task() for managed_node1/include_tasks 11000 1726867161.87442: worker is 1 (out of 1 available) 11000 1726867161.87456: exiting _queue_task() for managed_node1/include_tasks 11000 1726867161.87469: done queuing things up, now waiting for results queue to drain 11000 1726867161.87470: waiting for pending results... 11000 1726867161.87747: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11000 1726867161.87910: in run() - task 0affcac9-a3a5-c734-026a-00000000007d 11000 1726867161.88083: variable 'ansible_search_path' from source: unknown 11000 1726867161.88086: variable 'ansible_search_path' from source: unknown 11000 1726867161.88091: calling self._execute() 11000 1726867161.88094: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867161.88097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867161.88105: variable 'omit' from source: magic vars 11000 1726867161.88479: variable 'ansible_distribution_major_version' from source: facts 11000 1726867161.88501: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867161.88513: _execute() done 11000 1726867161.88521: dumping result to json 11000 1726867161.88528: done dumping result, returning 11000 1726867161.88545: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcac9-a3a5-c734-026a-00000000007d] 11000 1726867161.88555: sending task result for task 0affcac9-a3a5-c734-026a-00000000007d 11000 1726867161.88800: done sending task result for task 0affcac9-a3a5-c734-026a-00000000007d 11000 1726867161.88803: WORKER PROCESS EXITING 11000 1726867161.88843: no more pending results, returning what we have 11000 1726867161.88847: in VariableManager get_vars() 11000 1726867161.88895: Calling all_inventory to load vars for managed_node1 11000 1726867161.88898: Calling groups_inventory to load vars for managed_node1 11000 1726867161.88901: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867161.88913: Calling all_plugins_play to load vars for 
managed_node1 11000 1726867161.88916: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867161.88919: Calling groups_plugins_play to load vars for managed_node1 11000 1726867161.90463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867161.91955: done with get_vars() 11000 1726867161.91973: variable 'ansible_search_path' from source: unknown 11000 1726867161.91974: variable 'ansible_search_path' from source: unknown 11000 1726867161.92018: we have included files to process 11000 1726867161.92019: generating all_blocks data 11000 1726867161.92021: done generating all_blocks data 11000 1726867161.92026: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11000 1726867161.92027: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11000 1726867161.92029: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11000 1726867161.92538: done processing included file 11000 1726867161.92540: iterating over new_blocks loaded from include file 11000 1726867161.92541: in VariableManager get_vars() 11000 1726867161.92558: done with get_vars() 11000 1726867161.92559: filtering new block on tags 11000 1726867161.92579: done filtering new block on tags 11000 1726867161.92581: in VariableManager get_vars() 11000 1726867161.92598: done with get_vars() 11000 1726867161.92599: filtering new block on tags 11000 1726867161.92624: done filtering new block on tags 11000 1726867161.92626: in VariableManager get_vars() 11000 1726867161.92641: done with get_vars() 11000 1726867161.92642: filtering new block on tags 11000 1726867161.92665: done filtering new block on tags 11000 1726867161.92666: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 11000 1726867161.92670: extending task lists for all hosts with included blocks 11000 1726867161.93256: done extending task lists 11000 1726867161.93257: done processing included files 11000 1726867161.93257: results queue empty 11000 1726867161.93258: checking for any_errors_fatal 11000 1726867161.93261: done checking for any_errors_fatal 11000 1726867161.93261: checking for max_fail_percentage 11000 1726867161.93262: done checking for max_fail_percentage 11000 1726867161.93262: checking to see if all hosts have failed and the running result is not ok 11000 1726867161.93263: done checking to see if all hosts have failed 11000 1726867161.93263: getting the remaining hosts for this loop 11000 1726867161.93264: done getting the remaining hosts for this loop 11000 1726867161.93266: getting the next task for host managed_node1 11000 1726867161.93269: done getting next task for host managed_node1 11000 1726867161.93270: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11000 1726867161.93273: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867161.93282: getting variables 11000 1726867161.93283: in VariableManager get_vars() 11000 1726867161.93297: Calling all_inventory to load vars for managed_node1 11000 1726867161.93299: Calling groups_inventory to load vars for managed_node1 11000 1726867161.93300: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867161.93304: Calling all_plugins_play to load vars for managed_node1 11000 1726867161.93306: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867161.93307: Calling groups_plugins_play to load vars for managed_node1 11000 1726867161.93914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867161.95272: done with get_vars() 11000 1726867161.95289: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 17:19:21 -0400 (0:00:00.082) 0:00:23.596 ****** 11000 1726867161.95337: entering _queue_task() for managed_node1/setup 11000 1726867161.95583: worker is 1 (out of 1 available) 11000 1726867161.95598: exiting _queue_task() for managed_node1/setup 11000 1726867161.95609: done queuing things up, now waiting for results queue to drain 11000 1726867161.95611: waiting for pending results... 
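The task queued above at set_facts.yml:3 is a setup (fact-gathering) task; the entries that follow show it being skipped because every fact named in __network_required_facts is already present, and its skipped result is censored by no_log. A plausible sketch of that task, reconstructed from the logged module name, conditional, and no_log censoring; the exact module arguments are not visible in this log:

    - name: Ensure ansible_facts used by role are present
      setup:   # module arguments (e.g. a gather_subset) are not shown in this log
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
      no_log: true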
11000 1726867161.95780: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11000 1726867161.95901: in run() - task 0affcac9-a3a5-c734-026a-000000000494 11000 1726867161.95915: variable 'ansible_search_path' from source: unknown 11000 1726867161.95918: variable 'ansible_search_path' from source: unknown 11000 1726867161.95945: calling self._execute() 11000 1726867161.96028: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867161.96032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867161.96042: variable 'omit' from source: magic vars 11000 1726867161.96310: variable 'ansible_distribution_major_version' from source: facts 11000 1726867161.96320: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867161.96460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867161.98385: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867161.98408: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867161.98455: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867161.98499: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867161.98519: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867161.98576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867161.98610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867161.98629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867161.98654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867161.98665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867161.98709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867161.98727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867161.98743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867161.98767: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867161.98779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867161.98889: variable '__network_required_facts' from source: role '' defaults 11000 1726867161.98900: variable 'ansible_facts' from source: unknown 11000 1726867161.99335: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11000 1726867161.99339: when evaluation is False, skipping this task 11000 1726867161.99342: _execute() done 11000 1726867161.99344: dumping result to json 11000 1726867161.99348: done dumping result, returning 11000 1726867161.99352: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcac9-a3a5-c734-026a-000000000494] 11000 1726867161.99362: sending task result for task 0affcac9-a3a5-c734-026a-000000000494 11000 1726867161.99437: done sending task result for task 0affcac9-a3a5-c734-026a-000000000494 11000 1726867161.99440: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11000 1726867161.99513: no more pending results, returning what we have 11000 1726867161.99516: results queue empty 11000 1726867161.99517: checking for any_errors_fatal 11000 1726867161.99518: done checking for any_errors_fatal 11000 1726867161.99519: checking for max_fail_percentage 11000 1726867161.99520: done checking for max_fail_percentage 11000 1726867161.99521: checking to see if all hosts have failed and the running result is not ok 11000 1726867161.99522: done checking to see if all hosts have failed 11000 1726867161.99522: getting the remaining hosts for this loop 11000 1726867161.99524: done getting the remaining hosts for this loop 11000 1726867161.99527: getting the next task for host managed_node1 11000 1726867161.99536: done getting next task for host managed_node1 11000 1726867161.99539: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11000 1726867161.99543: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11000 1726867161.99559: getting variables 11000 1726867161.99560: in VariableManager get_vars() 11000 1726867161.99598: Calling all_inventory to load vars for managed_node1 11000 1726867161.99601: Calling groups_inventory to load vars for managed_node1 11000 1726867161.99602: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867161.99610: Calling all_plugins_play to load vars for managed_node1 11000 1726867161.99613: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867161.99615: Calling groups_plugins_play to load vars for managed_node1 11000 1726867162.00644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867162.02681: done with get_vars() 11000 1726867162.02716: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 17:19:22 -0400 (0:00:00.075) 0:00:23.671 ****** 11000 1726867162.02892: entering _queue_task() for managed_node1/stat 11000 1726867162.03221: worker is 1 (out of 1 available) 11000 1726867162.03235: exiting _queue_task() for managed_node1/stat 11000 1726867162.03247: done queuing things up, now waiting for results queue to drain 11000 1726867162.03249: waiting for pending results... 11000 1726867162.03484: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 11000 1726867162.03583: in run() - task 0affcac9-a3a5-c734-026a-000000000496 11000 1726867162.03601: variable 'ansible_search_path' from source: unknown 11000 1726867162.03604: variable 'ansible_search_path' from source: unknown 11000 1726867162.03633: calling self._execute() 11000 1726867162.03882: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867162.03886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867162.03890: variable 'omit' from source: magic vars 11000 1726867162.04175: variable 'ansible_distribution_major_version' from source: facts 11000 1726867162.04312: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867162.04645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867162.05109: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867162.05329: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867162.05390: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867162.05428: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867162.05512: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867162.05661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867162.05715: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 11000 1726867162.05765: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867162.05853: variable '__network_is_ostree' from source: set_fact 11000 1726867162.05866: Evaluated conditional (not __network_is_ostree is defined): False 11000 1726867162.05874: when evaluation is False, skipping this task 11000 1726867162.05885: _execute() done 11000 1726867162.05901: dumping result to json 11000 1726867162.05912: done dumping result, returning 11000 1726867162.05923: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcac9-a3a5-c734-026a-000000000496] 11000 1726867162.05932: sending task result for task 0affcac9-a3a5-c734-026a-000000000496 skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11000 1726867162.06089: no more pending results, returning what we have 11000 1726867162.06092: results queue empty 11000 1726867162.06093: checking for any_errors_fatal 11000 1726867162.06107: done checking for any_errors_fatal 11000 1726867162.06108: checking for max_fail_percentage 11000 1726867162.06110: done checking for max_fail_percentage 11000 1726867162.06111: checking to see if all hosts have failed and the running result is not ok 11000 1726867162.06112: done checking to see if all hosts have failed 11000 1726867162.06112: getting the remaining hosts for this loop 11000 1726867162.06114: done getting the remaining hosts for this loop 11000 1726867162.06117: getting the next task for host managed_node1 11000 1726867162.06124: done getting next task for host managed_node1 11000 1726867162.06127: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11000 1726867162.06132: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11000 1726867162.06148: getting variables 11000 1726867162.06149: in VariableManager get_vars() 11000 1726867162.06189: Calling all_inventory to load vars for managed_node1 11000 1726867162.06192: Calling groups_inventory to load vars for managed_node1 11000 1726867162.06195: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867162.06208: Calling all_plugins_play to load vars for managed_node1 11000 1726867162.06225: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867162.06231: done sending task result for task 0affcac9-a3a5-c734-026a-000000000496 11000 1726867162.06234: WORKER PROCESS EXITING 11000 1726867162.06243: Calling groups_plugins_play to load vars for managed_node1 11000 1726867162.07160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867162.08088: done with get_vars() 11000 1726867162.08108: done getting variables 11000 1726867162.08162: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 17:19:22 -0400 (0:00:00.053) 0:00:23.725 ****** 11000 1726867162.08201: entering _queue_task() for managed_node1/set_fact 11000 1726867162.08517: worker is 1 (out of 1 available) 11000 1726867162.08530: exiting _queue_task() for managed_node1/set_fact 11000 1726867162.08542: done queuing things up, now waiting for results queue to drain 11000 1726867162.08543: waiting for pending results... 
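The stat-based ostree check above was skipped because __network_is_ostree is already defined (false_condition: "not __network_is_ostree is defined"), and the set_fact task queued next is skipped on the same condition. For reference, a minimal sketch of how such a guarded pair in set_facts.yml is typically written; only the task names, the actions (stat, set_fact) and the when condition are taken from this log, while the /run/ostree-booted path and the __ostree_booted_stat register name are assumptions:

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted        # assumed marker file; not shown in the log
      register: __ostree_booted_stat    # assumed register name
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined

Because an earlier play already set the fact, both when clauses evaluate to False here and neither task touches the managed node.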
11000 1726867162.08761: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11000 1726867162.08948: in run() - task 0affcac9-a3a5-c734-026a-000000000497 11000 1726867162.08969: variable 'ansible_search_path' from source: unknown 11000 1726867162.08975: variable 'ansible_search_path' from source: unknown 11000 1726867162.09023: calling self._execute() 11000 1726867162.09121: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867162.09140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867162.09155: variable 'omit' from source: magic vars 11000 1726867162.09796: variable 'ansible_distribution_major_version' from source: facts 11000 1726867162.09800: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867162.09919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867162.10193: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867162.10245: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867162.10287: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867162.10330: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867162.10418: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867162.10454: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867162.10556: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867162.10560: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867162.10613: variable '__network_is_ostree' from source: set_fact 11000 1726867162.10625: Evaluated conditional (not __network_is_ostree is defined): False 11000 1726867162.10632: when evaluation is False, skipping this task 11000 1726867162.10638: _execute() done 11000 1726867162.10644: dumping result to json 11000 1726867162.10651: done dumping result, returning 11000 1726867162.10667: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcac9-a3a5-c734-026a-000000000497] 11000 1726867162.10676: sending task result for task 0affcac9-a3a5-c734-026a-000000000497 11000 1726867162.10889: done sending task result for task 0affcac9-a3a5-c734-026a-000000000497 11000 1726867162.10893: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11000 1726867162.10938: no more pending results, returning what we have 11000 1726867162.10941: results queue empty 11000 1726867162.10942: checking for any_errors_fatal 11000 1726867162.10947: done checking for any_errors_fatal 11000 
1726867162.10948: checking for max_fail_percentage 11000 1726867162.10949: done checking for max_fail_percentage 11000 1726867162.10950: checking to see if all hosts have failed and the running result is not ok 11000 1726867162.10951: done checking to see if all hosts have failed 11000 1726867162.10951: getting the remaining hosts for this loop 11000 1726867162.10953: done getting the remaining hosts for this loop 11000 1726867162.10956: getting the next task for host managed_node1 11000 1726867162.10965: done getting next task for host managed_node1 11000 1726867162.10968: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11000 1726867162.10972: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867162.10988: getting variables 11000 1726867162.10989: in VariableManager get_vars() 11000 1726867162.11020: Calling all_inventory to load vars for managed_node1 11000 1726867162.11022: Calling groups_inventory to load vars for managed_node1 11000 1726867162.11024: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867162.11032: Calling all_plugins_play to load vars for managed_node1 11000 1726867162.11034: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867162.11036: Calling groups_plugins_play to load vars for managed_node1 11000 1726867162.12376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867162.13914: done with get_vars() 11000 1726867162.13936: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 17:19:22 -0400 (0:00:00.058) 0:00:23.783 ****** 11000 1726867162.14028: entering _queue_task() for managed_node1/service_facts 11000 1726867162.14298: worker is 1 (out of 1 available) 11000 1726867162.14310: exiting _queue_task() for managed_node1/service_facts 11000 1726867162.14322: done queuing things up, now waiting for results queue to drain 11000 1726867162.14323: waiting for pending results... 
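The "Check which services are running" task that follows invokes the service_facts module with no arguments (the captured invocation shows empty module_args) and returns ansible_facts.services, a dictionary keyed by unit name whose entries carry name, state, status and source fields. A minimal sketch of such a call plus one hypothetical consumer task (the consumer is illustrative only and not part of the role; the NetworkManager.service key and its fields are taken from the module output captured below):

    - name: Check which services are running
      ansible.builtin.service_facts:

    - name: React to a running service (hypothetical example)
      ansible.builtin.debug:
        msg: "NetworkManager is running"
      when:
        - "'NetworkManager.service' in ansible_facts.services"
        - ansible_facts.services['NetworkManager.service'].state == 'running'

The trace below shows the full lifecycle of that call over the persistent SSH connection: creating a remote temp directory, transferring the cached AnsiballZ_service_facts.py payload via SFTP, chmod'ing it, executing it with the remote Python, and reading back the JSON facts.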
11000 1726867162.14603: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 11000 1726867162.14779: in run() - task 0affcac9-a3a5-c734-026a-000000000499 11000 1726867162.14807: variable 'ansible_search_path' from source: unknown 11000 1726867162.14816: variable 'ansible_search_path' from source: unknown 11000 1726867162.14880: calling self._execute() 11000 1726867162.14998: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867162.15012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867162.15185: variable 'omit' from source: magic vars 11000 1726867162.15405: variable 'ansible_distribution_major_version' from source: facts 11000 1726867162.15425: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867162.15436: variable 'omit' from source: magic vars 11000 1726867162.15523: variable 'omit' from source: magic vars 11000 1726867162.15563: variable 'omit' from source: magic vars 11000 1726867162.15608: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867162.15652: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867162.15680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867162.15703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867162.15720: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867162.15758: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867162.15768: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867162.15845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867162.15887: Set connection var ansible_shell_type to sh 11000 1726867162.15901: Set connection var ansible_pipelining to False 11000 1726867162.15916: Set connection var ansible_shell_executable to /bin/sh 11000 1726867162.15924: Set connection var ansible_connection to ssh 11000 1726867162.15935: Set connection var ansible_timeout to 10 11000 1726867162.15945: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867162.15981: variable 'ansible_shell_executable' from source: unknown 11000 1726867162.15991: variable 'ansible_connection' from source: unknown 11000 1726867162.16000: variable 'ansible_module_compression' from source: unknown 11000 1726867162.16007: variable 'ansible_shell_type' from source: unknown 11000 1726867162.16014: variable 'ansible_shell_executable' from source: unknown 11000 1726867162.16020: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867162.16027: variable 'ansible_pipelining' from source: unknown 11000 1726867162.16034: variable 'ansible_timeout' from source: unknown 11000 1726867162.16062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867162.16243: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11000 1726867162.16261: variable 'omit' from source: magic vars 11000 
1726867162.16281: starting attempt loop 11000 1726867162.16284: running the handler 11000 1726867162.16385: _low_level_execute_command(): starting 11000 1726867162.16388: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867162.17011: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867162.17047: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867162.17096: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867162.17156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867162.17170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867162.17195: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867162.17275: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867162.18974: stdout chunk (state=3): >>>/root <<< 11000 1726867162.19074: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867162.19136: stderr chunk (state=3): >>><<< 11000 1726867162.19163: stdout chunk (state=3): >>><<< 11000 1726867162.19186: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867162.19195: _low_level_execute_command(): starting 11000 1726867162.19202: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867162.1918087-12108-232096395735151 
`" && echo ansible-tmp-1726867162.1918087-12108-232096395735151="` echo /root/.ansible/tmp/ansible-tmp-1726867162.1918087-12108-232096395735151 `" ) && sleep 0' 11000 1726867162.19617: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867162.19621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867162.19624: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867162.19634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867162.19673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867162.19678: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867162.19730: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867162.21600: stdout chunk (state=3): >>>ansible-tmp-1726867162.1918087-12108-232096395735151=/root/.ansible/tmp/ansible-tmp-1726867162.1918087-12108-232096395735151 <<< 11000 1726867162.21713: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867162.21739: stderr chunk (state=3): >>><<< 11000 1726867162.21741: stdout chunk (state=3): >>><<< 11000 1726867162.21749: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867162.1918087-12108-232096395735151=/root/.ansible/tmp/ansible-tmp-1726867162.1918087-12108-232096395735151 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867162.21807: variable 'ansible_module_compression' from source: unknown 11000 1726867162.21825: ANSIBALLZ: using 
cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 11000 1726867162.21854: variable 'ansible_facts' from source: unknown 11000 1726867162.21914: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867162.1918087-12108-232096395735151/AnsiballZ_service_facts.py 11000 1726867162.22004: Sending initial data 11000 1726867162.22007: Sent initial data (162 bytes) 11000 1726867162.22511: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867162.22591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867162.22618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867162.22701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867162.24272: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867162.24305: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867162.24380: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpygwh53_t /root/.ansible/tmp/ansible-tmp-1726867162.1918087-12108-232096395735151/AnsiballZ_service_facts.py <<< 11000 1726867162.24392: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867162.1918087-12108-232096395735151/AnsiballZ_service_facts.py" <<< 11000 1726867162.24431: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpygwh53_t" to remote "/root/.ansible/tmp/ansible-tmp-1726867162.1918087-12108-232096395735151/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867162.1918087-12108-232096395735151/AnsiballZ_service_facts.py" <<< 11000 1726867162.25067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867162.25236: stderr chunk (state=3): >>><<< 11000 1726867162.25239: stdout chunk (state=3): >>><<< 11000 1726867162.25241: done transferring module to remote 11000 1726867162.25243: _low_level_execute_command(): starting 11000 1726867162.25245: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867162.1918087-12108-232096395735151/ /root/.ansible/tmp/ansible-tmp-1726867162.1918087-12108-232096395735151/AnsiballZ_service_facts.py && sleep 0' 11000 1726867162.25925: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867162.25933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867162.25937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867162.25940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867162.25942: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867162.25944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867162.25946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867162.26008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867162.26011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867162.26050: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867162.26149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867162.28152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867162.28155: stdout chunk (state=3): >>><<< 11000 1726867162.28158: stderr chunk (state=3): >>><<< 11000 1726867162.28160: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867162.28163: _low_level_execute_command(): starting 11000 1726867162.28165: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867162.1918087-12108-232096395735151/AnsiballZ_service_facts.py && sleep 0' 11000 1726867162.28952: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867162.28998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867162.29025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867162.29059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867162.29079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867162.29095: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867162.29110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867162.29130: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11000 1726867162.29222: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867162.29252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867162.29327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867163.80862: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": 
{"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": 
"irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source":<<< 11000 1726867163.80885: stdout chunk (state=3): >>> "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 11000 1726867163.80912: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "<<< 11000 1726867163.80923: stdout chunk (state=3): >>>inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 11000 1726867163.80941: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11000 1726867163.82390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867163.82423: stderr chunk (state=3): >>><<< 11000 1726867163.82427: stdout chunk (state=3): >>><<< 11000 1726867163.82461: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": 
{"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": 
{"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 11000 1726867163.83127: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867162.1918087-12108-232096395735151/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867163.83134: _low_level_execute_command(): starting 11000 1726867163.83138: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867162.1918087-12108-232096395735151/ > /dev/null 2>&1 && sleep 0' 11000 1726867163.83567: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867163.83602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867163.83605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867163.83607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867163.83610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867163.83663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867163.83670: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867163.83672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867163.83714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867163.85497: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 11000 1726867163.85524: stderr chunk (state=3): >>><<< 11000 1726867163.85528: stdout chunk (state=3): >>><<< 11000 1726867163.85539: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867163.85546: handler run complete 11000 1726867163.85652: variable 'ansible_facts' from source: unknown 11000 1726867163.85742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867163.86012: variable 'ansible_facts' from source: unknown 11000 1726867163.86096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867163.86204: attempt loop complete, returning result 11000 1726867163.86207: _execute() done 11000 1726867163.86209: dumping result to json 11000 1726867163.86245: done dumping result, returning 11000 1726867163.86252: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [0affcac9-a3a5-c734-026a-000000000499] 11000 1726867163.86256: sending task result for task 0affcac9-a3a5-c734-026a-000000000499 11000 1726867163.86944: done sending task result for task 0affcac9-a3a5-c734-026a-000000000499 11000 1726867163.86947: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11000 1726867163.86999: no more pending results, returning what we have 11000 1726867163.87000: results queue empty 11000 1726867163.87001: checking for any_errors_fatal 11000 1726867163.87003: done checking for any_errors_fatal 11000 1726867163.87004: checking for max_fail_percentage 11000 1726867163.87004: done checking for max_fail_percentage 11000 1726867163.87006: checking to see if all hosts have failed and the running result is not ok 11000 1726867163.87007: done checking to see if all hosts have failed 11000 1726867163.87008: getting the remaining hosts for this loop 11000 1726867163.87009: done getting the remaining hosts for this loop 11000 1726867163.87012: getting the next task for host managed_node1 11000 1726867163.87016: done getting next task for host managed_node1 11000 1726867163.87019: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11000 1726867163.87022: ^ state is: HOST STATE: 
block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867163.87029: getting variables 11000 1726867163.87029: in VariableManager get_vars() 11000 1726867163.87050: Calling all_inventory to load vars for managed_node1 11000 1726867163.87052: Calling groups_inventory to load vars for managed_node1 11000 1726867163.87054: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867163.87060: Calling all_plugins_play to load vars for managed_node1 11000 1726867163.87061: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867163.87063: Calling groups_plugins_play to load vars for managed_node1 11000 1726867163.87717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867163.88575: done with get_vars() 11000 1726867163.88595: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 17:19:23 -0400 (0:00:01.746) 0:00:25.529 ****** 11000 1726867163.88662: entering _queue_task() for managed_node1/package_facts 11000 1726867163.88875: worker is 1 (out of 1 available) 11000 1726867163.88888: exiting _queue_task() for managed_node1/package_facts 11000 1726867163.88900: done queuing things up, now waiting for results queue to drain 11000 1726867163.88901: waiting for pending results... 
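The censored result above is the raw service_facts payload (ansible_facts.services, one entry per systemd unit with name/state/status/source), and the package_facts task queued here returns the analogous ansible_facts.packages mapping that is dumped further below (package name mapped to a list of installed instances with name/version/release/epoch/arch/source). A minimal sketch of consuming those two structures from a saved copy of a module result follows; the helper script, its name, and the JSON file argument are hypothetical and not part of this run.

    #!/usr/bin/python3.12
    # Hypothetical helper (not part of ansible-core or of this playbook run):
    # shows how the ansible_facts payloads returned by service_facts (dumped
    # above) and package_facts (dumped below) can be consumed once saved to a
    # JSON file.
    import json
    import sys


    def summarize(path):
        with open(path) as fh:
            facts = json.load(fh)["ansible_facts"]
        if "services" in facts:
            # services maps unit name -> {"name", "state", "status", "source"}
            running = sorted(name for name, svc in facts["services"].items()
                             if svc.get("state") == "running")
            print(len(running), "running services:", ", ".join(running))
        if "packages" in facts:
            # packages maps package name -> list of installed instances, each
            # {"name", "version", "release", "epoch", "arch", "source"}
            total = sum(len(instances) for instances in facts["packages"].values())
            print(total, "installed package instances; libgcc ->",
                  facts["packages"].get("libgcc"))


    if __name__ == "__main__":
        summarize(sys.argv[1])  # path to a saved module-result JSON document

Invoked, for example, as '/usr/bin/python3.12 summarize_facts.py facts.json', where both file names are placeholders for a saved copy of the module output shown in this log.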
11000 1726867163.89074: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 11000 1726867163.89183: in run() - task 0affcac9-a3a5-c734-026a-00000000049a 11000 1726867163.89198: variable 'ansible_search_path' from source: unknown 11000 1726867163.89202: variable 'ansible_search_path' from source: unknown 11000 1726867163.89227: calling self._execute() 11000 1726867163.89304: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867163.89308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867163.89316: variable 'omit' from source: magic vars 11000 1726867163.89602: variable 'ansible_distribution_major_version' from source: facts 11000 1726867163.89611: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867163.89617: variable 'omit' from source: magic vars 11000 1726867163.89673: variable 'omit' from source: magic vars 11000 1726867163.89699: variable 'omit' from source: magic vars 11000 1726867163.89728: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867163.89753: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867163.89768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867163.89786: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867163.89798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867163.89820: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867163.89824: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867163.89826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867163.89892: Set connection var ansible_shell_type to sh 11000 1726867163.89901: Set connection var ansible_pipelining to False 11000 1726867163.89908: Set connection var ansible_shell_executable to /bin/sh 11000 1726867163.89911: Set connection var ansible_connection to ssh 11000 1726867163.89916: Set connection var ansible_timeout to 10 11000 1726867163.89921: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867163.89940: variable 'ansible_shell_executable' from source: unknown 11000 1726867163.89942: variable 'ansible_connection' from source: unknown 11000 1726867163.89945: variable 'ansible_module_compression' from source: unknown 11000 1726867163.89948: variable 'ansible_shell_type' from source: unknown 11000 1726867163.89950: variable 'ansible_shell_executable' from source: unknown 11000 1726867163.89952: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867163.89955: variable 'ansible_pipelining' from source: unknown 11000 1726867163.89957: variable 'ansible_timeout' from source: unknown 11000 1726867163.89961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867163.90105: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11000 1726867163.90116: variable 'omit' from source: magic vars 11000 
1726867163.90119: starting attempt loop 11000 1726867163.90123: running the handler 11000 1726867163.90133: _low_level_execute_command(): starting 11000 1726867163.90140: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867163.90636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867163.90641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867163.90644: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867163.90647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867163.90704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867163.90710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867163.90712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867163.90754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867163.92335: stdout chunk (state=3): >>>/root <<< 11000 1726867163.92437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867163.92461: stderr chunk (state=3): >>><<< 11000 1726867163.92464: stdout chunk (state=3): >>><<< 11000 1726867163.92487: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867163.92498: _low_level_execute_command(): starting 11000 1726867163.92503: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867163.924864-12178-271190998776180 `" && echo ansible-tmp-1726867163.924864-12178-271190998776180="` echo /root/.ansible/tmp/ansible-tmp-1726867163.924864-12178-271190998776180 `" ) && sleep 0' 11000 1726867163.92935: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867163.92938: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867163.92941: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867163.92950: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867163.92953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867163.92994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867163.92998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867163.93002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867163.93052: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867163.94889: stdout chunk (state=3): >>>ansible-tmp-1726867163.924864-12178-271190998776180=/root/.ansible/tmp/ansible-tmp-1726867163.924864-12178-271190998776180 <<< 11000 1726867163.95003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867163.95024: stderr chunk (state=3): >>><<< 11000 1726867163.95027: stdout chunk (state=3): >>><<< 11000 1726867163.95038: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867163.924864-12178-271190998776180=/root/.ansible/tmp/ansible-tmp-1726867163.924864-12178-271190998776180 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 11000 1726867163.95072: variable 'ansible_module_compression' from source: unknown 11000 1726867163.95113: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 11000 1726867163.95162: variable 'ansible_facts' from source: unknown 11000 1726867163.95276: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867163.924864-12178-271190998776180/AnsiballZ_package_facts.py 11000 1726867163.95369: Sending initial data 11000 1726867163.95372: Sent initial data (161 bytes) 11000 1726867163.95795: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867163.95798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867163.95800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867163.95802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867163.95804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867163.95851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867163.95858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867163.95903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867163.97420: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11000 1726867163.97424: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867163.97462: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867163.97509: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpzinwn127 /root/.ansible/tmp/ansible-tmp-1726867163.924864-12178-271190998776180/AnsiballZ_package_facts.py <<< 11000 1726867163.97512: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867163.924864-12178-271190998776180/AnsiballZ_package_facts.py" <<< 11000 1726867163.97552: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpzinwn127" to remote "/root/.ansible/tmp/ansible-tmp-1726867163.924864-12178-271190998776180/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867163.924864-12178-271190998776180/AnsiballZ_package_facts.py" <<< 11000 1726867163.98611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867163.98642: stderr chunk (state=3): >>><<< 11000 1726867163.98645: stdout chunk (state=3): >>><<< 11000 1726867163.98684: done transferring module to remote 11000 1726867163.98693: _low_level_execute_command(): starting 11000 1726867163.98696: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867163.924864-12178-271190998776180/ /root/.ansible/tmp/ansible-tmp-1726867163.924864-12178-271190998776180/AnsiballZ_package_facts.py && sleep 0' 11000 1726867163.99107: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867163.99110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867163.99112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867163.99114: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867163.99119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867163.99164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867163.99167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867163.99216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867164.00941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867164.00963: stderr chunk (state=3): >>><<< 11000 1726867164.00967: stdout chunk (state=3): >>><<< 11000 1726867164.00976: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867164.00981: _low_level_execute_command(): starting 11000 1726867164.00984: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867163.924864-12178-271190998776180/AnsiballZ_package_facts.py && sleep 0' 11000 1726867164.01373: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867164.01379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867164.01381: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867164.01384: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867164.01387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867164.01433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867164.01437: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867164.01490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867164.45667: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": 
[{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 
2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": 
"2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", 
"release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": 
"10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": 
"xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": 
"prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": 
"511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": 
"perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": 
"2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": 
"python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11000 1726867164.47356: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 11000 1726867164.47360: stdout chunk (state=3): >>><<< 11000 1726867164.47362: stderr chunk (state=3): >>><<< 11000 1726867164.47685: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": 
[{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", 
"release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", 
"version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": 
[{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": 
"510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], 
"perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], 
"perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 11000 1726867164.51956: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867163.924864-12178-271190998776180/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867164.52280: _low_level_execute_command(): starting 11000 1726867164.52284: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867163.924864-12178-271190998776180/ > /dev/null 2>&1 && sleep 0' 11000 1726867164.53234: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867164.53585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867164.53604: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867164.53627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867164.53911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867164.55766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867164.55776: stdout chunk (state=3): >>><<< 11000 1726867164.55794: stderr chunk (state=3): >>><<< 11000 1726867164.55814: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867164.55826: handler run complete 11000 1726867164.57469: variable 'ansible_facts' from source: unknown 11000 1726867164.58783: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867164.62184: variable 'ansible_facts' from source: unknown 11000 1726867164.62923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867164.63642: attempt loop complete, returning result 11000 1726867164.63660: _execute() done 11000 1726867164.63668: dumping result to json 11000 1726867164.63883: done dumping result, returning 11000 1726867164.63902: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcac9-a3a5-c734-026a-00000000049a] 11000 1726867164.63912: sending task result for task 0affcac9-a3a5-c734-026a-00000000049a 11000 1726867164.66550: done sending task result for task 0affcac9-a3a5-c734-026a-00000000049a 11000 1726867164.66554: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11000 1726867164.66708: no more pending results, returning what we have 11000 1726867164.66710: results queue empty 11000 1726867164.66711: checking for any_errors_fatal 11000 1726867164.66716: done checking for any_errors_fatal 11000 1726867164.66717: checking for max_fail_percentage 11000 1726867164.66718: done checking for max_fail_percentage 11000 1726867164.66719: checking to see if all hosts have failed and the running result is not ok 11000 1726867164.66720: done checking to see if all hosts have failed 11000 1726867164.66721: getting the remaining hosts for this loop 11000 1726867164.66722: done getting the remaining hosts for this loop 11000 1726867164.66725: getting the next task for host managed_node1 11000 1726867164.66732: done getting next task for host managed_node1 11000 1726867164.66736: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11000 1726867164.66740: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867164.66750: getting variables 11000 1726867164.66751: in VariableManager get_vars() 11000 1726867164.66804: Calling all_inventory to load vars for managed_node1 11000 1726867164.66807: Calling groups_inventory to load vars for managed_node1 11000 1726867164.66810: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867164.66818: Calling all_plugins_play to load vars for managed_node1 11000 1726867164.66821: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867164.66824: Calling groups_plugins_play to load vars for managed_node1 11000 1726867164.68670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867164.70420: done with get_vars() 11000 1726867164.70444: done getting variables 11000 1726867164.70726: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 17:19:24 -0400 (0:00:00.821) 0:00:26.350 ****** 11000 1726867164.70767: entering _queue_task() for managed_node1/debug 11000 1726867164.71452: worker is 1 (out of 1 available) 11000 1726867164.71464: exiting _queue_task() for managed_node1/debug 11000 1726867164.71475: done queuing things up, now waiting for results queue to drain 11000 1726867164.71476: waiting for pending results... 
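The package_facts result captured above is exposed to later tasks as ansible_facts.packages, a dictionary keyed by package name whose values are lists of {name, version, release, epoch, arch, source} entries, exactly as shown in the module output. As a minimal sketch (not part of this run; the task name is hypothetical), a follow-up task could read that structure like this:

    - name: Report the installed rsyslog version (illustrative only)
      ansible.builtin.debug:
        msg: "rsyslog {{ ansible_facts.packages['rsyslog'][0].version }}-{{ ansible_facts.packages['rsyslog'][0].release }}"
      when: "'rsyslog' in ansible_facts.packages"

With the facts gathered above, such a task would report rsyslog 8.2408.0-1.el10 on managed_node1.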
11000 1726867164.71922: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 11000 1726867164.71935: in run() - task 0affcac9-a3a5-c734-026a-00000000007e 11000 1726867164.71957: variable 'ansible_search_path' from source: unknown 11000 1726867164.71966: variable 'ansible_search_path' from source: unknown 11000 1726867164.72015: calling self._execute() 11000 1726867164.72120: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867164.72241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867164.72244: variable 'omit' from source: magic vars 11000 1726867164.72539: variable 'ansible_distribution_major_version' from source: facts 11000 1726867164.72556: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867164.72574: variable 'omit' from source: magic vars 11000 1726867164.72643: variable 'omit' from source: magic vars 11000 1726867164.72749: variable 'network_provider' from source: set_fact 11000 1726867164.72773: variable 'omit' from source: magic vars 11000 1726867164.72827: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867164.72896: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867164.72900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867164.72920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867164.72938: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867164.72970: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867164.73003: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867164.73006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867164.73096: Set connection var ansible_shell_type to sh 11000 1726867164.73116: Set connection var ansible_pipelining to False 11000 1726867164.73182: Set connection var ansible_shell_executable to /bin/sh 11000 1726867164.73186: Set connection var ansible_connection to ssh 11000 1726867164.73191: Set connection var ansible_timeout to 10 11000 1726867164.73194: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867164.73196: variable 'ansible_shell_executable' from source: unknown 11000 1726867164.73199: variable 'ansible_connection' from source: unknown 11000 1726867164.73201: variable 'ansible_module_compression' from source: unknown 11000 1726867164.73203: variable 'ansible_shell_type' from source: unknown 11000 1726867164.73205: variable 'ansible_shell_executable' from source: unknown 11000 1726867164.73210: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867164.73225: variable 'ansible_pipelining' from source: unknown 11000 1726867164.73233: variable 'ansible_timeout' from source: unknown 11000 1726867164.73241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867164.73393: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 11000 1726867164.73437: variable 'omit' from source: magic vars 11000 1726867164.73440: starting attempt loop 11000 1726867164.73443: running the handler 11000 1726867164.73475: handler run complete 11000 1726867164.73500: attempt loop complete, returning result 11000 1726867164.73546: _execute() done 11000 1726867164.73549: dumping result to json 11000 1726867164.73552: done dumping result, returning 11000 1726867164.73554: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [0affcac9-a3a5-c734-026a-00000000007e] 11000 1726867164.73556: sending task result for task 0affcac9-a3a5-c734-026a-00000000007e ok: [managed_node1] => {} MSG: Using network provider: nm 11000 1726867164.73741: no more pending results, returning what we have 11000 1726867164.73745: results queue empty 11000 1726867164.73746: checking for any_errors_fatal 11000 1726867164.73761: done checking for any_errors_fatal 11000 1726867164.73762: checking for max_fail_percentage 11000 1726867164.73764: done checking for max_fail_percentage 11000 1726867164.73765: checking to see if all hosts have failed and the running result is not ok 11000 1726867164.73766: done checking to see if all hosts have failed 11000 1726867164.73766: getting the remaining hosts for this loop 11000 1726867164.73768: done getting the remaining hosts for this loop 11000 1726867164.73772: getting the next task for host managed_node1 11000 1726867164.73780: done getting next task for host managed_node1 11000 1726867164.73784: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11000 1726867164.73791: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11000 1726867164.73804: getting variables 11000 1726867164.73806: in VariableManager get_vars() 11000 1726867164.73845: Calling all_inventory to load vars for managed_node1 11000 1726867164.73849: Calling groups_inventory to load vars for managed_node1 11000 1726867164.73851: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867164.73862: Calling all_plugins_play to load vars for managed_node1 11000 1726867164.74085: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867164.74093: Calling groups_plugins_play to load vars for managed_node1 11000 1726867164.74703: done sending task result for task 0affcac9-a3a5-c734-026a-00000000007e 11000 1726867164.74707: WORKER PROCESS EXITING 11000 1726867164.79542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867164.80387: done with get_vars() 11000 1726867164.80405: done getting variables 11000 1726867164.80440: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 17:19:24 -0400 (0:00:00.096) 0:00:26.447 ****** 11000 1726867164.80460: entering _queue_task() for managed_node1/fail 11000 1726867164.80735: worker is 1 (out of 1 available) 11000 1726867164.80747: exiting _queue_task() for managed_node1/fail 11000 1726867164.80759: done queuing things up, now waiting for results queue to drain 11000 1726867164.80761: waiting for pending results... 
11000 1726867164.80996: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11000 1726867164.81159: in run() - task 0affcac9-a3a5-c734-026a-00000000007f 11000 1726867164.81180: variable 'ansible_search_path' from source: unknown 11000 1726867164.81189: variable 'ansible_search_path' from source: unknown 11000 1726867164.81226: calling self._execute() 11000 1726867164.81335: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867164.81346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867164.81360: variable 'omit' from source: magic vars 11000 1726867164.81738: variable 'ansible_distribution_major_version' from source: facts 11000 1726867164.81755: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867164.81882: variable 'network_state' from source: role '' defaults 11000 1726867164.81897: Evaluated conditional (network_state != {}): False 11000 1726867164.81904: when evaluation is False, skipping this task 11000 1726867164.81910: _execute() done 11000 1726867164.81918: dumping result to json 11000 1726867164.81923: done dumping result, returning 11000 1726867164.81932: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcac9-a3a5-c734-026a-00000000007f] 11000 1726867164.81939: sending task result for task 0affcac9-a3a5-c734-026a-00000000007f 11000 1726867164.82052: done sending task result for task 0affcac9-a3a5-c734-026a-00000000007f 11000 1726867164.82059: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11000 1726867164.82113: no more pending results, returning what we have 11000 1726867164.82116: results queue empty 11000 1726867164.82117: checking for any_errors_fatal 11000 1726867164.82124: done checking for any_errors_fatal 11000 1726867164.82125: checking for max_fail_percentage 11000 1726867164.82127: done checking for max_fail_percentage 11000 1726867164.82127: checking to see if all hosts have failed and the running result is not ok 11000 1726867164.82128: done checking to see if all hosts have failed 11000 1726867164.82129: getting the remaining hosts for this loop 11000 1726867164.82130: done getting the remaining hosts for this loop 11000 1726867164.82133: getting the next task for host managed_node1 11000 1726867164.82139: done getting next task for host managed_node1 11000 1726867164.82142: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11000 1726867164.82145: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867164.82165: getting variables 11000 1726867164.82166: in VariableManager get_vars() 11000 1726867164.82207: Calling all_inventory to load vars for managed_node1 11000 1726867164.82210: Calling groups_inventory to load vars for managed_node1 11000 1726867164.82212: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867164.82222: Calling all_plugins_play to load vars for managed_node1 11000 1726867164.82225: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867164.82227: Calling groups_plugins_play to load vars for managed_node1 11000 1726867164.83385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867164.84387: done with get_vars() 11000 1726867164.84403: done getting variables 11000 1726867164.84442: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 17:19:24 -0400 (0:00:00.040) 0:00:26.487 ****** 11000 1726867164.84466: entering _queue_task() for managed_node1/fail 11000 1726867164.84668: worker is 1 (out of 1 available) 11000 1726867164.84682: exiting _queue_task() for managed_node1/fail 11000 1726867164.84692: done queuing things up, now waiting for results queue to drain 11000 1726867164.84694: waiting for pending results... 
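The skip recorded above reports "false_condition": "network_state != {}", i.e. the role's network_state default is an empty mapping for this run. As a minimal sketch of a guard task of that general shape (not the role's actual source), a conditional fail that would produce exactly this kind of skip looks like:

    - name: Abort when the network_state variable is in use (illustrative only)
      ansible.builtin.fail:
        msg: Applying the network state configuration is not supported in this situation.
      when: network_state != {}    # reported as False above, so the task is skipped

The same evaluation pattern repeats for the next guard below, which checks the managed host's system version and is skipped on the same false condition.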
11000 1726867164.84868: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11000 1726867164.84962: in run() - task 0affcac9-a3a5-c734-026a-000000000080 11000 1726867164.84974: variable 'ansible_search_path' from source: unknown 11000 1726867164.84978: variable 'ansible_search_path' from source: unknown 11000 1726867164.85009: calling self._execute() 11000 1726867164.85086: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867164.85093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867164.85102: variable 'omit' from source: magic vars 11000 1726867164.85378: variable 'ansible_distribution_major_version' from source: facts 11000 1726867164.85387: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867164.85497: variable 'network_state' from source: role '' defaults 11000 1726867164.85501: Evaluated conditional (network_state != {}): False 11000 1726867164.85504: when evaluation is False, skipping this task 11000 1726867164.85507: _execute() done 11000 1726867164.85509: dumping result to json 11000 1726867164.85512: done dumping result, returning 11000 1726867164.85515: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcac9-a3a5-c734-026a-000000000080] 11000 1726867164.85517: sending task result for task 0affcac9-a3a5-c734-026a-000000000080 11000 1726867164.85628: done sending task result for task 0affcac9-a3a5-c734-026a-000000000080 11000 1726867164.85632: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11000 1726867164.85673: no more pending results, returning what we have 11000 1726867164.85679: results queue empty 11000 1726867164.85680: checking for any_errors_fatal 11000 1726867164.85686: done checking for any_errors_fatal 11000 1726867164.85687: checking for max_fail_percentage 11000 1726867164.85689: done checking for max_fail_percentage 11000 1726867164.85690: checking to see if all hosts have failed and the running result is not ok 11000 1726867164.85691: done checking to see if all hosts have failed 11000 1726867164.85691: getting the remaining hosts for this loop 11000 1726867164.85693: done getting the remaining hosts for this loop 11000 1726867164.85697: getting the next task for host managed_node1 11000 1726867164.85704: done getting next task for host managed_node1 11000 1726867164.85708: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11000 1726867164.85711: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867164.85728: getting variables 11000 1726867164.85729: in VariableManager get_vars() 11000 1726867164.85763: Calling all_inventory to load vars for managed_node1 11000 1726867164.85766: Calling groups_inventory to load vars for managed_node1 11000 1726867164.85768: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867164.85797: Calling all_plugins_play to load vars for managed_node1 11000 1726867164.85802: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867164.85805: Calling groups_plugins_play to load vars for managed_node1 11000 1726867164.87013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867164.87890: done with get_vars() 11000 1726867164.87903: done getting variables 11000 1726867164.87940: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 17:19:24 -0400 (0:00:00.034) 0:00:26.522 ****** 11000 1726867164.87963: entering _queue_task() for managed_node1/fail 11000 1726867164.88158: worker is 1 (out of 1 available) 11000 1726867164.88169: exiting _queue_task() for managed_node1/fail 11000 1726867164.88183: done queuing things up, now waiting for results queue to drain 11000 1726867164.88185: waiting for pending results... 
11000 1726867164.88352: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11000 1726867164.88438: in run() - task 0affcac9-a3a5-c734-026a-000000000081 11000 1726867164.88449: variable 'ansible_search_path' from source: unknown 11000 1726867164.88452: variable 'ansible_search_path' from source: unknown 11000 1726867164.88480: calling self._execute() 11000 1726867164.88560: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867164.88564: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867164.88573: variable 'omit' from source: magic vars 11000 1726867164.88848: variable 'ansible_distribution_major_version' from source: facts 11000 1726867164.88854: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867164.88971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867164.90870: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867164.90924: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867164.90950: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867164.90975: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867164.91003: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867164.91060: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867164.91081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867164.91104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867164.91130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867164.91141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867164.91209: variable 'ansible_distribution_major_version' from source: facts 11000 1726867164.91219: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11000 1726867164.91294: variable 'ansible_distribution' from source: facts 11000 1726867164.91297: variable '__network_rh_distros' from source: role '' defaults 11000 1726867164.91303: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11000 1726867164.91459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867164.91476: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867164.91496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867164.91521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867164.91533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867164.91567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867164.91584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867164.91603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867164.91627: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867164.91640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867164.91670: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867164.91690: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867164.91705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867164.91729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867164.91739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867164.91939: variable 'network_connections' from source: task vars 11000 1726867164.91947: variable 'port2_profile' from source: play vars 11000 1726867164.91996: variable 'port2_profile' from source: play vars 11000 1726867164.92004: variable 'port1_profile' from source: play vars 11000 1726867164.92045: variable 'port1_profile' from source: play vars 11000 1726867164.92053: variable 'controller_profile' from source: play vars 
11000 1726867164.92097: variable 'controller_profile' from source: play vars 11000 1726867164.92104: variable 'network_state' from source: role '' defaults 11000 1726867164.92148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867164.92263: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867164.92296: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867164.92317: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867164.92338: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867164.92369: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867164.92387: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867164.92410: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867164.92428: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867164.92453: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11000 1726867164.92457: when evaluation is False, skipping this task 11000 1726867164.92459: _execute() done 11000 1726867164.92462: dumping result to json 11000 1726867164.92464: done dumping result, returning 11000 1726867164.92470: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcac9-a3a5-c734-026a-000000000081] 11000 1726867164.92474: sending task result for task 0affcac9-a3a5-c734-026a-000000000081 11000 1726867164.92560: done sending task result for task 0affcac9-a3a5-c734-026a-000000000081 11000 1726867164.92563: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11000 1726867164.92611: no more pending results, returning what we have 11000 1726867164.92613: results queue empty 11000 1726867164.92614: checking for any_errors_fatal 11000 1726867164.92621: done checking for any_errors_fatal 11000 1726867164.92622: checking for max_fail_percentage 11000 1726867164.92624: done checking for max_fail_percentage 11000 1726867164.92625: checking to see if all hosts have failed and the running result is not ok 11000 1726867164.92626: done checking to see if all hosts have failed 11000 
1726867164.92626: getting the remaining hosts for this loop 11000 1726867164.92628: done getting the remaining hosts for this loop 11000 1726867164.92631: getting the next task for host managed_node1 11000 1726867164.92637: done getting next task for host managed_node1 11000 1726867164.92640: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11000 1726867164.92643: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867164.92660: getting variables 11000 1726867164.92661: in VariableManager get_vars() 11000 1726867164.92702: Calling all_inventory to load vars for managed_node1 11000 1726867164.92704: Calling groups_inventory to load vars for managed_node1 11000 1726867164.92706: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867164.92715: Calling all_plugins_play to load vars for managed_node1 11000 1726867164.92717: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867164.92720: Calling groups_plugins_play to load vars for managed_node1 11000 1726867164.94070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867164.95721: done with get_vars() 11000 1726867164.95750: done getting variables 11000 1726867164.95818: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 17:19:24 -0400 (0:00:00.078) 0:00:26.601 ****** 11000 1726867164.95858: entering _queue_task() for managed_node1/dnf 11000 1726867164.96225: worker is 1 (out of 1 available) 11000 1726867164.96237: exiting _queue_task() for managed_node1/dnf 11000 1726867164.96251: done queuing things up, now waiting for results queue to drain 11000 1726867164.96252: waiting for pending results... 
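The teaming guard above is skipped because the quoted selectattr expression evaluates to False: none of the defined connection profiles (or network_state interfaces) have type "team". As a self-contained sketch using a hypothetical network_connections list (not the profiles from this run), the same expression can be exercised like this:

    - name: Demonstrate the team-detection expression (illustrative only)
      vars:
        network_connections:
          - name: team0
            type: team
          - name: eth0          # entries whose type does not match ^team$ are filtered out
            type: ethernet
      ansible.builtin.debug:
        msg: >-
          {{ network_connections | selectattr('type', 'defined')
             | selectattr('type', 'match', '^team$') | list | length > 0 }}

With this hypothetical list the expression prints True; in the logged run it is False because only bond and ethernet profiles are defined, so the abort task is skipped.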
11000 1726867164.96695: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11000 1726867164.96753: in run() - task 0affcac9-a3a5-c734-026a-000000000082 11000 1726867164.96772: variable 'ansible_search_path' from source: unknown 11000 1726867164.96883: variable 'ansible_search_path' from source: unknown 11000 1726867164.96887: calling self._execute() 11000 1726867164.96922: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867164.96935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867164.96951: variable 'omit' from source: magic vars 11000 1726867164.97301: variable 'ansible_distribution_major_version' from source: facts 11000 1726867164.97320: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867164.97567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867165.00184: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867165.00266: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867165.00319: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867165.00361: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867165.00399: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867165.00491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.00535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.00567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.00619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.00750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.00771: variable 'ansible_distribution' from source: facts 11000 1726867165.00783: variable 'ansible_distribution_major_version' from source: facts 11000 1726867165.00806: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11000 1726867165.00928: variable '__network_wireless_connections_defined' from source: role '' defaults 11000 1726867165.01069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.01109: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.01138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.01191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.01297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.01300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.01303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.01319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.01361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.01384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.01437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.01467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.01501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.01551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.01571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.01746: variable 'network_connections' from source: task vars 11000 1726867165.01763: variable 'port2_profile' from source: play vars 11000 1726867165.01837: variable 'port2_profile' from source: play vars 11000 1726867165.01856: variable 'port1_profile' from source: play vars 11000 1726867165.01951: variable 'port1_profile' from source: play vars 11000 1726867165.01954: variable 'controller_profile' from source: play vars 
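Before resolving network_connections, the trace above evaluates two distribution gates verbatim: ansible_distribution_major_version != '6' and ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7, both True on this host. A minimal, hedged way to check the same gates on another machine (localhost is only an example target):

    - hosts: localhost
      tasks:
        - name: Show the distribution gates evaluated in the trace
          ansible.builtin.debug:
            msg:
              - "{{ ansible_distribution_major_version != '6' }}"
              - "{{ ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7 }}"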
11000 1726867165.02000: variable 'controller_profile' from source: play vars 11000 1726867165.02085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867165.02258: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867165.02318: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867165.02386: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867165.02392: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867165.02434: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867165.02470: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867165.02582: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.02586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867165.02595: variable '__network_team_connections_defined' from source: role '' defaults 11000 1726867165.02848: variable 'network_connections' from source: task vars 11000 1726867165.02858: variable 'port2_profile' from source: play vars 11000 1726867165.02925: variable 'port2_profile' from source: play vars 11000 1726867165.02943: variable 'port1_profile' from source: play vars 11000 1726867165.03007: variable 'port1_profile' from source: play vars 11000 1726867165.03019: variable 'controller_profile' from source: play vars 11000 1726867165.03090: variable 'controller_profile' from source: play vars 11000 1726867165.03118: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11000 1726867165.03152: when evaluation is False, skipping this task 11000 1726867165.03156: _execute() done 11000 1726867165.03158: dumping result to json 11000 1726867165.03161: done dumping result, returning 11000 1726867165.03163: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcac9-a3a5-c734-026a-000000000082] 11000 1726867165.03169: sending task result for task 0affcac9-a3a5-c734-026a-000000000082 11000 1726867165.03335: done sending task result for task 0affcac9-a3a5-c734-026a-000000000082 11000 1726867165.03339: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11000 1726867165.03396: no more pending results, returning what we have 11000 1726867165.03399: results queue empty 11000 1726867165.03400: checking for any_errors_fatal 11000 1726867165.03406: done checking for any_errors_fatal 11000 1726867165.03407: checking for max_fail_percentage 11000 1726867165.03409: done checking 
for max_fail_percentage 11000 1726867165.03410: checking to see if all hosts have failed and the running result is not ok 11000 1726867165.03411: done checking to see if all hosts have failed 11000 1726867165.03411: getting the remaining hosts for this loop 11000 1726867165.03413: done getting the remaining hosts for this loop 11000 1726867165.03416: getting the next task for host managed_node1 11000 1726867165.03424: done getting next task for host managed_node1 11000 1726867165.03427: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11000 1726867165.03431: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867165.03449: getting variables 11000 1726867165.03451: in VariableManager get_vars() 11000 1726867165.03495: Calling all_inventory to load vars for managed_node1 11000 1726867165.03498: Calling groups_inventory to load vars for managed_node1 11000 1726867165.03500: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867165.03511: Calling all_plugins_play to load vars for managed_node1 11000 1726867165.03513: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867165.03516: Calling groups_plugins_play to load vars for managed_node1 11000 1726867165.05275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867165.06926: done with get_vars() 11000 1726867165.06951: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11000 1726867165.07032: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 17:19:25 -0400 (0:00:00.112) 0:00:26.713 ****** 11000 1726867165.07071: entering _queue_task() for managed_node1/yum 11000 1726867165.07495: worker is 1 (out of 1 available) 11000 1726867165.07506: exiting _queue_task() for managed_node1/yum 11000 1726867165.07518: done queuing things up, now waiting for results queue to drain 11000 1726867165.07519: waiting for pending results... 
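The header above also shows ansible-core redirecting ansible.builtin.yum to ansible.builtin.dnf on this platform. The role's own update check was skipped (neither wireless nor team profiles are defined here), but a hand-rolled, check-mode equivalent, offered only as a sketch and not as the role's implementation, could look like:

    - hosts: localhost
      become: true
      tasks:
        - name: Dry-run check for a newer NetworkManager build
          ansible.builtin.dnf:
            name: NetworkManager
            state: latest
          check_mode: true
          register: nm_update

        - name: Report whether an update would have been pulled in
          ansible.builtin.debug:
            var: nm_update.changed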
11000 1726867165.07822: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11000 1726867165.07883: in run() - task 0affcac9-a3a5-c734-026a-000000000083 11000 1726867165.07922: variable 'ansible_search_path' from source: unknown 11000 1726867165.07925: variable 'ansible_search_path' from source: unknown 11000 1726867165.07983: calling self._execute() 11000 1726867165.08076: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867165.08139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867165.08144: variable 'omit' from source: magic vars 11000 1726867165.08508: variable 'ansible_distribution_major_version' from source: facts 11000 1726867165.08523: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867165.08798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867165.10930: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867165.11017: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867165.11055: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867165.11102: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867165.11133: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867165.11223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.11258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.11294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.11345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.11366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.11481: variable 'ansible_distribution_major_version' from source: facts 11000 1726867165.11493: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11000 1726867165.11497: when evaluation is False, skipping this task 11000 1726867165.11500: _execute() done 11000 1726867165.11502: dumping result to json 11000 1726867165.11507: done dumping result, returning 11000 1726867165.11514: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcac9-a3a5-c734-026a-000000000083] 11000 
1726867165.11521: sending task result for task 0affcac9-a3a5-c734-026a-000000000083 11000 1726867165.11621: done sending task result for task 0affcac9-a3a5-c734-026a-000000000083 11000 1726867165.11624: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11000 1726867165.11681: no more pending results, returning what we have 11000 1726867165.11684: results queue empty 11000 1726867165.11685: checking for any_errors_fatal 11000 1726867165.11694: done checking for any_errors_fatal 11000 1726867165.11695: checking for max_fail_percentage 11000 1726867165.11696: done checking for max_fail_percentage 11000 1726867165.11697: checking to see if all hosts have failed and the running result is not ok 11000 1726867165.11698: done checking to see if all hosts have failed 11000 1726867165.11699: getting the remaining hosts for this loop 11000 1726867165.11701: done getting the remaining hosts for this loop 11000 1726867165.11704: getting the next task for host managed_node1 11000 1726867165.11710: done getting next task for host managed_node1 11000 1726867165.11714: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11000 1726867165.11718: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11000 1726867165.11736: getting variables 11000 1726867165.11738: in VariableManager get_vars() 11000 1726867165.11771: Calling all_inventory to load vars for managed_node1 11000 1726867165.11773: Calling groups_inventory to load vars for managed_node1 11000 1726867165.11775: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867165.11785: Calling all_plugins_play to load vars for managed_node1 11000 1726867165.11789: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867165.11792: Calling groups_plugins_play to load vars for managed_node1 11000 1726867165.12546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867165.13732: done with get_vars() 11000 1726867165.13751: done getting variables 11000 1726867165.13808: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 17:19:25 -0400 (0:00:00.067) 0:00:26.781 ****** 11000 1726867165.13841: entering _queue_task() for managed_node1/fail 11000 1726867165.14120: worker is 1 (out of 1 available) 11000 1726867165.14132: exiting _queue_task() for managed_node1/fail 11000 1726867165.14144: done queuing things up, now waiting for results queue to drain 11000 1726867165.14146: waiting for pending results... 
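The trace above skips the YUM-only variant because ansible_distribution_major_version | int < 8 is False, then queues the fail-based consent task from main.yml:60. A hedged sketch of such a consent gate follows; the variable name network_allow_restart and the message are placeholders and do not come from the role:

    - hosts: localhost
      gather_facts: false
      vars:
        network_allow_restart: false   # placeholder toggle, not a role variable
      tasks:
        - name: Refuse to continue unless a NetworkManager restart was approved
          ansible.builtin.fail:
            msg: Wireless or team profiles require restarting NetworkManager; rerun with network_allow_restart=true.
          when: not network_allow_restart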
11000 1726867165.14422: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11000 1726867165.14515: in run() - task 0affcac9-a3a5-c734-026a-000000000084 11000 1726867165.14519: variable 'ansible_search_path' from source: unknown 11000 1726867165.14523: variable 'ansible_search_path' from source: unknown 11000 1726867165.14550: calling self._execute() 11000 1726867165.14627: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867165.14631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867165.14638: variable 'omit' from source: magic vars 11000 1726867165.14901: variable 'ansible_distribution_major_version' from source: facts 11000 1726867165.14910: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867165.14993: variable '__network_wireless_connections_defined' from source: role '' defaults 11000 1726867165.15119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867165.16783: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867165.16786: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867165.16789: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867165.16797: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867165.16829: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867165.16911: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.16947: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.16981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.17028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.17049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.17104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.17133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.17163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.17219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.17241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.17287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.17318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.17347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.17391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.17410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.17583: variable 'network_connections' from source: task vars 11000 1726867165.17601: variable 'port2_profile' from source: play vars 11000 1726867165.17668: variable 'port2_profile' from source: play vars 11000 1726867165.17686: variable 'port1_profile' from source: play vars 11000 1726867165.17752: variable 'port1_profile' from source: play vars 11000 1726867165.17765: variable 'controller_profile' from source: play vars 11000 1726867165.17982: variable 'controller_profile' from source: play vars 11000 1726867165.17986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867165.18072: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867165.18116: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867165.18151: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867165.18185: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867165.18229: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867165.18255: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867165.18290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.18322: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867165.18376: variable '__network_team_connections_defined' from source: role '' defaults 11000 1726867165.18610: variable 'network_connections' from source: task vars 11000 1726867165.18621: variable 'port2_profile' from source: play vars 11000 1726867165.18684: variable 'port2_profile' from source: play vars 11000 1726867165.18699: variable 'port1_profile' from source: play vars 11000 1726867165.18761: variable 'port1_profile' from source: play vars 11000 1726867165.18775: variable 'controller_profile' from source: play vars 11000 1726867165.18838: variable 'controller_profile' from source: play vars 11000 1726867165.18867: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11000 1726867165.18888: when evaluation is False, skipping this task 11000 1726867165.18896: _execute() done 11000 1726867165.18904: dumping result to json 11000 1726867165.18912: done dumping result, returning 11000 1726867165.18923: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-c734-026a-000000000084] 11000 1726867165.18932: sending task result for task 0affcac9-a3a5-c734-026a-000000000084 skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11000 1726867165.19082: no more pending results, returning what we have 11000 1726867165.19086: results queue empty 11000 1726867165.19087: checking for any_errors_fatal 11000 1726867165.19095: done checking for any_errors_fatal 11000 1726867165.19095: checking for max_fail_percentage 11000 1726867165.19097: done checking for max_fail_percentage 11000 1726867165.19098: checking to see if all hosts have failed and the running result is not ok 11000 1726867165.19098: done checking to see if all hosts have failed 11000 1726867165.19099: getting the remaining hosts for this loop 11000 1726867165.19100: done getting the remaining hosts for this loop 11000 1726867165.19103: getting the next task for host managed_node1 11000 1726867165.19109: done getting next task for host managed_node1 11000 1726867165.19113: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11000 1726867165.19228: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11000 1726867165.19248: getting variables 11000 1726867165.19249: in VariableManager get_vars() 11000 1726867165.19285: Calling all_inventory to load vars for managed_node1 11000 1726867165.19290: Calling groups_inventory to load vars for managed_node1 11000 1726867165.19292: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867165.19301: Calling all_plugins_play to load vars for managed_node1 11000 1726867165.19303: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867165.19306: Calling groups_plugins_play to load vars for managed_node1 11000 1726867165.19824: done sending task result for task 0affcac9-a3a5-c734-026a-000000000084 11000 1726867165.19827: WORKER PROCESS EXITING 11000 1726867165.20637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867165.22175: done with get_vars() 11000 1726867165.22200: done getting variables 11000 1726867165.22260: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 17:19:25 -0400 (0:00:00.084) 0:00:26.866 ****** 11000 1726867165.22299: entering _queue_task() for managed_node1/package 11000 1726867165.22640: worker is 1 (out of 1 available) 11000 1726867165.22653: exiting _queue_task() for managed_node1/package 11000 1726867165.22665: done queuing things up, now waiting for results queue to drain 11000 1726867165.22666: waiting for pending results... 
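The Install packages task from main.yml:73 is dispatched above through the generic package action. A simplified stand-in is sketched below; the single-entry network_packages list is an assumption, since the real value comes from role defaults that the log does not print:

    - hosts: localhost
      become: true
      vars:
        network_packages:              # illustrative value only
          - NetworkManager
      tasks:
        - name: Install the network packages (simplified stand-in for main.yml:73)
          ansible.builtin.package:
            name: "{{ network_packages }}"
            state: present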
11000 1726867165.23093: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 11000 1726867165.23097: in run() - task 0affcac9-a3a5-c734-026a-000000000085 11000 1726867165.23105: variable 'ansible_search_path' from source: unknown 11000 1726867165.23114: variable 'ansible_search_path' from source: unknown 11000 1726867165.23154: calling self._execute() 11000 1726867165.23269: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867165.23285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867165.23300: variable 'omit' from source: magic vars 11000 1726867165.23681: variable 'ansible_distribution_major_version' from source: facts 11000 1726867165.23699: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867165.23906: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867165.24172: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867165.24226: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867165.24268: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867165.24355: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867165.24517: variable 'network_packages' from source: role '' defaults 11000 1726867165.24593: variable '__network_provider_setup' from source: role '' defaults 11000 1726867165.24609: variable '__network_service_name_default_nm' from source: role '' defaults 11000 1726867165.24681: variable '__network_service_name_default_nm' from source: role '' defaults 11000 1726867165.24696: variable '__network_packages_default_nm' from source: role '' defaults 11000 1726867165.24763: variable '__network_packages_default_nm' from source: role '' defaults 11000 1726867165.24956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867165.27484: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867165.27488: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867165.27491: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867165.27493: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867165.27495: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867165.27511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.27545: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.27579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.27629: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.27650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.27700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.27733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.27762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.27807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.27830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.28061: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11000 1726867165.28199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.28229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.28265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.28309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.28327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.28420: variable 'ansible_python' from source: facts 11000 1726867165.28452: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11000 1726867165.28543: variable '__network_wpa_supplicant_required' from source: role '' defaults 11000 1726867165.28626: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11000 1726867165.28756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.28788: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11000 1726867165.28823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.28867: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.28910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.28945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.29018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.29021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.29059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.29082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.29243: variable 'network_connections' from source: task vars 11000 1726867165.29345: variable 'port2_profile' from source: play vars 11000 1726867165.29358: variable 'port2_profile' from source: play vars 11000 1726867165.29373: variable 'port1_profile' from source: play vars 11000 1726867165.29479: variable 'port1_profile' from source: play vars 11000 1726867165.29496: variable 'controller_profile' from source: play vars 11000 1726867165.29600: variable 'controller_profile' from source: play vars 11000 1726867165.29679: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867165.29713: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867165.29748: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.29794: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867165.29848: variable '__network_wireless_connections_defined' from source: role '' defaults 11000 1726867165.30129: variable 'network_connections' from source: task vars 11000 1726867165.30139: variable 'port2_profile' from source: play vars 11000 1726867165.30241: variable 'port2_profile' from source: play vars 11000 
1726867165.30258: variable 'port1_profile' from source: play vars 11000 1726867165.30359: variable 'port1_profile' from source: play vars 11000 1726867165.30373: variable 'controller_profile' from source: play vars 11000 1726867165.30544: variable 'controller_profile' from source: play vars 11000 1726867165.30548: variable '__network_packages_default_wireless' from source: role '' defaults 11000 1726867165.30605: variable '__network_wireless_connections_defined' from source: role '' defaults 11000 1726867165.31171: variable 'network_connections' from source: task vars 11000 1726867165.31184: variable 'port2_profile' from source: play vars 11000 1726867165.31252: variable 'port2_profile' from source: play vars 11000 1726867165.31265: variable 'port1_profile' from source: play vars 11000 1726867165.31335: variable 'port1_profile' from source: play vars 11000 1726867165.31347: variable 'controller_profile' from source: play vars 11000 1726867165.31417: variable 'controller_profile' from source: play vars 11000 1726867165.31446: variable '__network_packages_default_team' from source: role '' defaults 11000 1726867165.31530: variable '__network_team_connections_defined' from source: role '' defaults 11000 1726867165.31836: variable 'network_connections' from source: task vars 11000 1726867165.31960: variable 'port2_profile' from source: play vars 11000 1726867165.31963: variable 'port2_profile' from source: play vars 11000 1726867165.31965: variable 'port1_profile' from source: play vars 11000 1726867165.31990: variable 'port1_profile' from source: play vars 11000 1726867165.32002: variable 'controller_profile' from source: play vars 11000 1726867165.32064: variable 'controller_profile' from source: play vars 11000 1726867165.32123: variable '__network_service_name_default_initscripts' from source: role '' defaults 11000 1726867165.32190: variable '__network_service_name_default_initscripts' from source: role '' defaults 11000 1726867165.32203: variable '__network_packages_default_initscripts' from source: role '' defaults 11000 1726867165.32262: variable '__network_packages_default_initscripts' from source: role '' defaults 11000 1726867165.32501: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11000 1726867165.33264: variable 'network_connections' from source: task vars 11000 1726867165.33267: variable 'port2_profile' from source: play vars 11000 1726867165.33280: variable 'port2_profile' from source: play vars 11000 1726867165.33293: variable 'port1_profile' from source: play vars 11000 1726867165.33352: variable 'port1_profile' from source: play vars 11000 1726867165.33493: variable 'controller_profile' from source: play vars 11000 1726867165.33553: variable 'controller_profile' from source: play vars 11000 1726867165.33566: variable 'ansible_distribution' from source: facts 11000 1726867165.33575: variable '__network_rh_distros' from source: role '' defaults 11000 1726867165.33588: variable 'ansible_distribution_major_version' from source: facts 11000 1726867165.33617: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11000 1726867165.33936: variable 'ansible_distribution' from source: facts 11000 1726867165.33997: variable '__network_rh_distros' from source: role '' defaults 11000 1726867165.34006: variable 'ansible_distribution_major_version' from source: facts 11000 1726867165.34023: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11000 1726867165.34328: 
variable 'ansible_distribution' from source: facts 11000 1726867165.34531: variable '__network_rh_distros' from source: role '' defaults 11000 1726867165.34534: variable 'ansible_distribution_major_version' from source: facts 11000 1726867165.34537: variable 'network_provider' from source: set_fact 11000 1726867165.34539: variable 'ansible_facts' from source: unknown 11000 1726867165.35770: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11000 1726867165.35774: when evaluation is False, skipping this task 11000 1726867165.35776: _execute() done 11000 1726867165.35781: dumping result to json 11000 1726867165.35783: done dumping result, returning 11000 1726867165.35795: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [0affcac9-a3a5-c734-026a-000000000085] 11000 1726867165.35800: sending task result for task 0affcac9-a3a5-c734-026a-000000000085 11000 1726867165.35903: done sending task result for task 0affcac9-a3a5-c734-026a-000000000085 11000 1726867165.35906: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11000 1726867165.35980: no more pending results, returning what we have 11000 1726867165.35984: results queue empty 11000 1726867165.35985: checking for any_errors_fatal 11000 1726867165.35992: done checking for any_errors_fatal 11000 1726867165.35993: checking for max_fail_percentage 11000 1726867165.35994: done checking for max_fail_percentage 11000 1726867165.35995: checking to see if all hosts have failed and the running result is not ok 11000 1726867165.35996: done checking to see if all hosts have failed 11000 1726867165.35997: getting the remaining hosts for this loop 11000 1726867165.35998: done getting the remaining hosts for this loop 11000 1726867165.36006: getting the next task for host managed_node1 11000 1726867165.36013: done getting next task for host managed_node1 11000 1726867165.36016: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11000 1726867165.36020: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11000 1726867165.36037: getting variables 11000 1726867165.36039: in VariableManager get_vars() 11000 1726867165.36076: Calling all_inventory to load vars for managed_node1 11000 1726867165.36081: Calling groups_inventory to load vars for managed_node1 11000 1726867165.36083: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867165.36093: Calling all_plugins_play to load vars for managed_node1 11000 1726867165.36095: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867165.36098: Calling groups_plugins_play to load vars for managed_node1 11000 1726867165.38243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867165.39627: done with get_vars() 11000 1726867165.39648: done getting variables 11000 1726867165.39711: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 17:19:25 -0400 (0:00:00.174) 0:00:27.040 ****** 11000 1726867165.39751: entering _queue_task() for managed_node1/package 11000 1726867165.40094: worker is 1 (out of 1 available) 11000 1726867165.40107: exiting _queue_task() for managed_node1/package 11000 1726867165.40126: done queuing things up, now waiting for results queue to drain 11000 1726867165.40128: waiting for pending results... 
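The skip above hinges on not network_packages is subset(ansible_facts.packages.keys()): every required package is already installed, so nothing is handed to the package manager. The same test can be reproduced outside the role roughly as follows (the package list is again illustrative):

    - hosts: localhost
      vars:
        network_packages:
          - NetworkManager
      tasks:
        - name: Populate ansible_facts.packages
          ansible.builtin.package_facts:

        - name: Re-run the subset test seen in the log
          ansible.builtin.debug:
            msg: "{{ not network_packages is subset(ansible_facts.packages.keys()) }}"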
11000 1726867165.40494: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11000 1726867165.40499: in run() - task 0affcac9-a3a5-c734-026a-000000000086 11000 1726867165.40512: variable 'ansible_search_path' from source: unknown 11000 1726867165.40519: variable 'ansible_search_path' from source: unknown 11000 1726867165.40554: calling self._execute() 11000 1726867165.40663: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867165.40675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867165.40691: variable 'omit' from source: magic vars 11000 1726867165.41070: variable 'ansible_distribution_major_version' from source: facts 11000 1726867165.41088: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867165.41212: variable 'network_state' from source: role '' defaults 11000 1726867165.41226: Evaluated conditional (network_state != {}): False 11000 1726867165.41234: when evaluation is False, skipping this task 11000 1726867165.41241: _execute() done 11000 1726867165.41248: dumping result to json 11000 1726867165.41254: done dumping result, returning 11000 1726867165.41268: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcac9-a3a5-c734-026a-000000000086] 11000 1726867165.41279: sending task result for task 0affcac9-a3a5-c734-026a-000000000086 11000 1726867165.41494: done sending task result for task 0affcac9-a3a5-c734-026a-000000000086 11000 1726867165.41497: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11000 1726867165.41550: no more pending results, returning what we have 11000 1726867165.41554: results queue empty 11000 1726867165.41555: checking for any_errors_fatal 11000 1726867165.41563: done checking for any_errors_fatal 11000 1726867165.41563: checking for max_fail_percentage 11000 1726867165.41565: done checking for max_fail_percentage 11000 1726867165.41566: checking to see if all hosts have failed and the running result is not ok 11000 1726867165.41566: done checking to see if all hosts have failed 11000 1726867165.41567: getting the remaining hosts for this loop 11000 1726867165.41569: done getting the remaining hosts for this loop 11000 1726867165.41572: getting the next task for host managed_node1 11000 1726867165.41580: done getting next task for host managed_node1 11000 1726867165.41584: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11000 1726867165.41588: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867165.41609: getting variables 11000 1726867165.41611: in VariableManager get_vars() 11000 1726867165.41650: Calling all_inventory to load vars for managed_node1 11000 1726867165.41652: Calling groups_inventory to load vars for managed_node1 11000 1726867165.41655: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867165.41667: Calling all_plugins_play to load vars for managed_node1 11000 1726867165.41670: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867165.41673: Calling groups_plugins_play to load vars for managed_node1 11000 1726867165.43613: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867165.45165: done with get_vars() 11000 1726867165.45186: done getting variables 11000 1726867165.45239: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 17:19:25 -0400 (0:00:00.055) 0:00:27.095 ****** 11000 1726867165.45272: entering _queue_task() for managed_node1/package 11000 1726867165.45522: worker is 1 (out of 1 available) 11000 1726867165.45533: exiting _queue_task() for managed_node1/package 11000 1726867165.45544: done queuing things up, now waiting for results queue to drain 11000 1726867165.45545: waiting for pending results... 
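The Install NetworkManager and nmstate task above is skipped because network_state is still the role default, an empty dict, and the python3-libnmstate task just queued is gated on the same check. For either to run, the play would need to supply a non-empty network_state; the snippet below is a small, assumed example of nmstate-style content (interface name and details are made up):

    network_state:
      interfaces:
        - name: eth1
          type: ethernet
          state: up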
11000 1726867165.45809: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11000 1726867165.45954: in run() - task 0affcac9-a3a5-c734-026a-000000000087 11000 1726867165.46001: variable 'ansible_search_path' from source: unknown 11000 1726867165.46004: variable 'ansible_search_path' from source: unknown 11000 1726867165.46022: calling self._execute() 11000 1726867165.46122: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867165.46133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867165.46182: variable 'omit' from source: magic vars 11000 1726867165.46495: variable 'ansible_distribution_major_version' from source: facts 11000 1726867165.46515: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867165.46636: variable 'network_state' from source: role '' defaults 11000 1726867165.46654: Evaluated conditional (network_state != {}): False 11000 1726867165.46661: when evaluation is False, skipping this task 11000 1726867165.46668: _execute() done 11000 1726867165.46757: dumping result to json 11000 1726867165.46760: done dumping result, returning 11000 1726867165.46762: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcac9-a3a5-c734-026a-000000000087] 11000 1726867165.46764: sending task result for task 0affcac9-a3a5-c734-026a-000000000087 11000 1726867165.46826: done sending task result for task 0affcac9-a3a5-c734-026a-000000000087 11000 1726867165.46829: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11000 1726867165.46905: no more pending results, returning what we have 11000 1726867165.46910: results queue empty 11000 1726867165.46911: checking for any_errors_fatal 11000 1726867165.46918: done checking for any_errors_fatal 11000 1726867165.46918: checking for max_fail_percentage 11000 1726867165.46920: done checking for max_fail_percentage 11000 1726867165.46921: checking to see if all hosts have failed and the running result is not ok 11000 1726867165.46922: done checking to see if all hosts have failed 11000 1726867165.46922: getting the remaining hosts for this loop 11000 1726867165.46924: done getting the remaining hosts for this loop 11000 1726867165.46928: getting the next task for host managed_node1 11000 1726867165.46934: done getting next task for host managed_node1 11000 1726867165.46937: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11000 1726867165.46941: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867165.46960: getting variables 11000 1726867165.46961: in VariableManager get_vars() 11000 1726867165.47000: Calling all_inventory to load vars for managed_node1 11000 1726867165.47003: Calling groups_inventory to load vars for managed_node1 11000 1726867165.47005: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867165.47017: Calling all_plugins_play to load vars for managed_node1 11000 1726867165.47020: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867165.47024: Calling groups_plugins_play to load vars for managed_node1 11000 1726867165.48524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867165.50044: done with get_vars() 11000 1726867165.50064: done getting variables 11000 1726867165.50123: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 17:19:25 -0400 (0:00:00.048) 0:00:27.144 ****** 11000 1726867165.50157: entering _queue_task() for managed_node1/service 11000 1726867165.50401: worker is 1 (out of 1 available) 11000 1726867165.50413: exiting _queue_task() for managed_node1/service 11000 1726867165.50425: done queuing things up, now waiting for results queue to drain 11000 1726867165.50426: waiting for pending results... 
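Each skipped task above reports a skipping: [host] => { ... } block carrying false_condition and skip_reason fields. A small helper like the following (an assumption about this log's shape, not anything Ansible ships) can pull those results out of a -vvvv log such as this one:

```python
# Sketch of a log-reading helper; it assumes each skip block ends with the
# "skip_reason" key, which is how the blocks in this log are laid out.
import json
import re

SKIP_RE = re.compile(
    r'skipping: \[(?P<host>[^\]]+)\] => (?P<body>\{.*?"skip_reason": "[^"]*"\s*\})',
    re.DOTALL,
)

def skipped_tasks(log_text):
    """Yield (host, false_condition, skip_reason) for each skip block found."""
    for match in SKIP_RE.finditer(log_text):
        body = json.loads(match.group("body"))
        yield match.group("host"), body.get("false_condition"), body.get("skip_reason")

sample = ('skipping: [managed_node1] => { "changed": false, '
          '"false_condition": "network_state != {}", '
          '"skip_reason": "Conditional result was False" }')
print(list(skipped_tasks(sample)))
# [('managed_node1', 'network_state != {}', 'Conditional result was False')]
```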
11000 1726867165.51028: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11000 1726867165.51160: in run() - task 0affcac9-a3a5-c734-026a-000000000088 11000 1726867165.51202: variable 'ansible_search_path' from source: unknown 11000 1726867165.51241: variable 'ansible_search_path' from source: unknown 11000 1726867165.51344: calling self._execute() 11000 1726867165.51571: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867165.51587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867165.51607: variable 'omit' from source: magic vars 11000 1726867165.52322: variable 'ansible_distribution_major_version' from source: facts 11000 1726867165.52484: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867165.52684: variable '__network_wireless_connections_defined' from source: role '' defaults 11000 1726867165.53061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867165.55406: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867165.55486: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867165.55521: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867165.55560: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867165.55606: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867165.55664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.55714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.55718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.55783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.55786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.55822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.55846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.55874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11000 1726867165.55916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.56025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.56028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.56031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.56036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.56051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.56066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.56279: variable 'network_connections' from source: task vars 11000 1726867165.56282: variable 'port2_profile' from source: play vars 11000 1726867165.56318: variable 'port2_profile' from source: play vars 11000 1726867165.56330: variable 'port1_profile' from source: play vars 11000 1726867165.56391: variable 'port1_profile' from source: play vars 11000 1726867165.56395: variable 'controller_profile' from source: play vars 11000 1726867165.56461: variable 'controller_profile' from source: play vars 11000 1726867165.56533: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867165.56706: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867165.56743: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867165.56770: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867165.56799: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867165.57003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867165.57006: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867165.57008: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.57011: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867165.57013: variable '__network_team_connections_defined' from source: role '' defaults 11000 1726867165.57359: variable 'network_connections' from source: task vars 11000 1726867165.57362: variable 'port2_profile' from source: play vars 11000 1726867165.57364: variable 'port2_profile' from source: play vars 11000 1726867165.57366: variable 'port1_profile' from source: play vars 11000 1726867165.57368: variable 'port1_profile' from source: play vars 11000 1726867165.57370: variable 'controller_profile' from source: play vars 11000 1726867165.57385: variable 'controller_profile' from source: play vars 11000 1726867165.57411: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11000 1726867165.57421: when evaluation is False, skipping this task 11000 1726867165.57423: _execute() done 11000 1726867165.57426: dumping result to json 11000 1726867165.57428: done dumping result, returning 11000 1726867165.57431: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcac9-a3a5-c734-026a-000000000088] 11000 1726867165.57433: sending task result for task 0affcac9-a3a5-c734-026a-000000000088 11000 1726867165.57531: done sending task result for task 0affcac9-a3a5-c734-026a-000000000088 11000 1726867165.57533: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11000 1726867165.57574: no more pending results, returning what we have 11000 1726867165.57579: results queue empty 11000 1726867165.57580: checking for any_errors_fatal 11000 1726867165.57586: done checking for any_errors_fatal 11000 1726867165.57586: checking for max_fail_percentage 11000 1726867165.57590: done checking for max_fail_percentage 11000 1726867165.57591: checking to see if all hosts have failed and the running result is not ok 11000 1726867165.57592: done checking to see if all hosts have failed 11000 1726867165.57593: getting the remaining hosts for this loop 11000 1726867165.57594: done getting the remaining hosts for this loop 11000 1726867165.57597: getting the next task for host managed_node1 11000 1726867165.57603: done getting next task for host managed_node1 11000 1726867165.57607: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11000 1726867165.57610: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11000 1726867165.57627: getting variables 11000 1726867165.57629: in VariableManager get_vars() 11000 1726867165.57670: Calling all_inventory to load vars for managed_node1 11000 1726867165.57673: Calling groups_inventory to load vars for managed_node1 11000 1726867165.57675: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867165.57685: Calling all_plugins_play to load vars for managed_node1 11000 1726867165.57690: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867165.57692: Calling groups_plugins_play to load vars for managed_node1 11000 1726867165.58981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867165.60686: done with get_vars() 11000 1726867165.60705: done getting variables 11000 1726867165.60761: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 17:19:25 -0400 (0:00:00.106) 0:00:27.251 ****** 11000 1726867165.60795: entering _queue_task() for managed_node1/service 11000 1726867165.61065: worker is 1 (out of 1 available) 11000 1726867165.61080: exiting _queue_task() for managed_node1/service 11000 1726867165.61093: done queuing things up, now waiting for results queue to drain 11000 1726867165.61095: waiting for pending results... 11000 1726867165.61500: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11000 1726867165.61524: in run() - task 0affcac9-a3a5-c734-026a-000000000089 11000 1726867165.61544: variable 'ansible_search_path' from source: unknown 11000 1726867165.61552: variable 'ansible_search_path' from source: unknown 11000 1726867165.61597: calling self._execute() 11000 1726867165.61699: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867165.61713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867165.61729: variable 'omit' from source: magic vars 11000 1726867165.62101: variable 'ansible_distribution_major_version' from source: facts 11000 1726867165.62119: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867165.62282: variable 'network_provider' from source: set_fact 11000 1726867165.62294: variable 'network_state' from source: role '' defaults 11000 1726867165.62310: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11000 1726867165.62319: variable 'omit' from source: magic vars 11000 1726867165.62396: variable 'omit' from source: magic vars 11000 1726867165.62467: variable 'network_service_name' from source: role '' defaults 11000 1726867165.62499: variable 'network_service_name' from source: role '' defaults 11000 1726867165.62598: variable '__network_provider_setup' from source: role '' defaults 11000 1726867165.62609: variable '__network_service_name_default_nm' from source: role '' defaults 11000 1726867165.62674: variable '__network_service_name_default_nm' from source: role '' defaults 11000 1726867165.62693: variable '__network_packages_default_nm' from source: role '' defaults 
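The skip just above ("Restart NetworkManager due to wireless or team interfaces") follows from scanning network_connections: the play supplies a controller profile and two port profiles, and no wireless or team profile is defined, so __network_wireless_connections_defined or __network_team_connections_defined comes out False. A hypothetical reconstruction of that kind of scan (the profile names and types below are illustrative, not taken from the play):

```python
# Illustrative data only; the real connection types are not shown in this
# part of the log (all that is known is that none of them is wireless or team).
network_connections = [
    {"name": "controller0", "type": "bridge"},
    {"name": "port1", "type": "ethernet", "controller": "controller0"},
    {"name": "port2", "type": "ethernet", "controller": "controller0"},
]

wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
team_defined = any(c.get("type") == "team" for c in network_connections)

# Neither profile type is present, so the conditional is False and the
# restart task is skipped, matching the log above.
print(wireless_defined or team_defined)  # False
```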
11000 1726867165.62757: variable '__network_packages_default_nm' from source: role '' defaults 11000 1726867165.63008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867165.65104: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867165.65187: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867165.65229: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867165.65274: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867165.65384: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867165.65395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.65430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.65459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.65510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.65528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.65576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.65612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.65643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.65687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.65706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.65948: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11000 1726867165.66152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.66155: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.66157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.66159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.66160: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.66240: variable 'ansible_python' from source: facts 11000 1726867165.66273: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11000 1726867165.66359: variable '__network_wpa_supplicant_required' from source: role '' defaults 11000 1726867165.66449: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11000 1726867165.66586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.66616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.66646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.66693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.66714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.66765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867165.66810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867165.66840: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.66910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867165.66913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867165.67051: variable 'network_connections' from 
source: task vars 11000 1726867165.67064: variable 'port2_profile' from source: play vars 11000 1726867165.67144: variable 'port2_profile' from source: play vars 11000 1726867165.67183: variable 'port1_profile' from source: play vars 11000 1726867165.67244: variable 'port1_profile' from source: play vars 11000 1726867165.67262: variable 'controller_profile' from source: play vars 11000 1726867165.67346: variable 'controller_profile' from source: play vars 11000 1726867165.67580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867165.67645: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867165.67706: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867165.67753: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867165.67808: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867165.67871: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867165.67904: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867165.67941: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867165.67973: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867165.68027: variable '__network_wireless_connections_defined' from source: role '' defaults 11000 1726867165.68295: variable 'network_connections' from source: task vars 11000 1726867165.68306: variable 'port2_profile' from source: play vars 11000 1726867165.68389: variable 'port2_profile' from source: play vars 11000 1726867165.68405: variable 'port1_profile' from source: play vars 11000 1726867165.68484: variable 'port1_profile' from source: play vars 11000 1726867165.68500: variable 'controller_profile' from source: play vars 11000 1726867165.68575: variable 'controller_profile' from source: play vars 11000 1726867165.68613: variable '__network_packages_default_wireless' from source: role '' defaults 11000 1726867165.68699: variable '__network_wireless_connections_defined' from source: role '' defaults 11000 1726867165.68992: variable 'network_connections' from source: task vars 11000 1726867165.69112: variable 'port2_profile' from source: play vars 11000 1726867165.69115: variable 'port2_profile' from source: play vars 11000 1726867165.69118: variable 'port1_profile' from source: play vars 11000 1726867165.69157: variable 'port1_profile' from source: play vars 11000 1726867165.69169: variable 'controller_profile' from source: play vars 11000 1726867165.69246: variable 'controller_profile' from source: play vars 11000 1726867165.69273: variable '__network_packages_default_team' from source: role '' defaults 11000 1726867165.69357: variable '__network_team_connections_defined' from source: role '' defaults 11000 1726867165.69644: variable 'network_connections' 
from source: task vars 11000 1726867165.69658: variable 'port2_profile' from source: play vars 11000 1726867165.69728: variable 'port2_profile' from source: play vars 11000 1726867165.69741: variable 'port1_profile' from source: play vars 11000 1726867165.69815: variable 'port1_profile' from source: play vars 11000 1726867165.69827: variable 'controller_profile' from source: play vars 11000 1726867165.69902: variable 'controller_profile' from source: play vars 11000 1726867165.69956: variable '__network_service_name_default_initscripts' from source: role '' defaults 11000 1726867165.70023: variable '__network_service_name_default_initscripts' from source: role '' defaults 11000 1726867165.70035: variable '__network_packages_default_initscripts' from source: role '' defaults 11000 1726867165.70101: variable '__network_packages_default_initscripts' from source: role '' defaults 11000 1726867165.70315: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11000 1726867165.70818: variable 'network_connections' from source: task vars 11000 1726867165.70828: variable 'port2_profile' from source: play vars 11000 1726867165.70952: variable 'port2_profile' from source: play vars 11000 1726867165.70956: variable 'port1_profile' from source: play vars 11000 1726867165.70973: variable 'port1_profile' from source: play vars 11000 1726867165.70989: variable 'controller_profile' from source: play vars 11000 1726867165.71049: variable 'controller_profile' from source: play vars 11000 1726867165.71067: variable 'ansible_distribution' from source: facts 11000 1726867165.71076: variable '__network_rh_distros' from source: role '' defaults 11000 1726867165.71092: variable 'ansible_distribution_major_version' from source: facts 11000 1726867165.71113: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11000 1726867165.71274: variable 'ansible_distribution' from source: facts 11000 1726867165.71289: variable '__network_rh_distros' from source: role '' defaults 11000 1726867165.71382: variable 'ansible_distribution_major_version' from source: facts 11000 1726867165.71388: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11000 1726867165.71489: variable 'ansible_distribution' from source: facts 11000 1726867165.71503: variable '__network_rh_distros' from source: role '' defaults 11000 1726867165.71513: variable 'ansible_distribution_major_version' from source: facts 11000 1726867165.71551: variable 'network_provider' from source: set_fact 11000 1726867165.71581: variable 'omit' from source: magic vars 11000 1726867165.71618: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867165.71651: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867165.71675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867165.71698: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867165.71715: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867165.71752: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867165.71782: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867165.71785: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867165.71875: Set connection var ansible_shell_type to sh 11000 1726867165.71940: Set connection var ansible_pipelining to False 11000 1726867165.71943: Set connection var ansible_shell_executable to /bin/sh 11000 1726867165.71946: Set connection var ansible_connection to ssh 11000 1726867165.71948: Set connection var ansible_timeout to 10 11000 1726867165.71950: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867165.71956: variable 'ansible_shell_executable' from source: unknown 11000 1726867165.71964: variable 'ansible_connection' from source: unknown 11000 1726867165.71971: variable 'ansible_module_compression' from source: unknown 11000 1726867165.71979: variable 'ansible_shell_type' from source: unknown 11000 1726867165.71986: variable 'ansible_shell_executable' from source: unknown 11000 1726867165.71992: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867165.72000: variable 'ansible_pipelining' from source: unknown 11000 1726867165.72006: variable 'ansible_timeout' from source: unknown 11000 1726867165.72014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867165.72159: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867165.72163: variable 'omit' from source: magic vars 11000 1726867165.72165: starting attempt loop 11000 1726867165.72167: running the handler 11000 1726867165.72234: variable 'ansible_facts' from source: unknown 11000 1726867165.72993: _low_level_execute_command(): starting 11000 1726867165.73007: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867165.73692: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867165.73707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867165.73719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867165.73733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867165.73751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867165.73760: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867165.73773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867165.73796: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11000 1726867165.73858: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867165.73892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867165.73917: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 11000 1726867165.73936: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867165.74021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867165.75695: stdout chunk (state=3): >>>/root <<< 11000 1726867165.75867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867165.75870: stdout chunk (state=3): >>><<< 11000 1726867165.75873: stderr chunk (state=3): >>><<< 11000 1726867165.75982: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867165.75987: _low_level_execute_command(): starting 11000 1726867165.75990: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867165.758975-12247-197847280284662 `" && echo ansible-tmp-1726867165.758975-12247-197847280284662="` echo /root/.ansible/tmp/ansible-tmp-1726867165.758975-12247-197847280284662 `" ) && sleep 0' 11000 1726867165.76615: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867165.76629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867165.76653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867165.76679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867165.76725: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867165.76841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867165.76844: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867165.76905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867165.76987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867165.78830: stdout chunk (state=3): >>>ansible-tmp-1726867165.758975-12247-197847280284662=/root/.ansible/tmp/ansible-tmp-1726867165.758975-12247-197847280284662 <<< 11000 1726867165.78993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867165.78997: stdout chunk (state=3): >>><<< 11000 1726867165.78999: stderr chunk (state=3): >>><<< 11000 1726867165.79016: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867165.758975-12247-197847280284662=/root/.ansible/tmp/ansible-tmp-1726867165.758975-12247-197847280284662 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867165.79064: variable 'ansible_module_compression' from source: unknown 11000 1726867165.79182: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11000 1726867165.79186: variable 'ansible_facts' from source: unknown 11000 1726867165.79393: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867165.758975-12247-197847280284662/AnsiballZ_systemd.py 11000 1726867165.79536: Sending initial data 11000 1726867165.79652: Sent initial data (155 bytes) 11000 1726867165.80202: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11000 1726867165.80244: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867165.80247: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11000 1726867165.80295: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867165.80309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867165.80332: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867165.80335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867165.80421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867165.81955: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867165.82013: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11000 1726867165.82079: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmp3epjfqat /root/.ansible/tmp/ansible-tmp-1726867165.758975-12247-197847280284662/AnsiballZ_systemd.py <<< 11000 1726867165.82083: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867165.758975-12247-197847280284662/AnsiballZ_systemd.py" <<< 11000 1726867165.82132: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmp3epjfqat" to remote "/root/.ansible/tmp/ansible-tmp-1726867165.758975-12247-197847280284662/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867165.758975-12247-197847280284662/AnsiballZ_systemd.py" <<< 11000 1726867165.83834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867165.84007: stderr chunk (state=3): >>><<< 11000 1726867165.84010: stdout chunk (state=3): >>><<< 11000 1726867165.84012: done transferring module to remote 11000 1726867165.84014: _low_level_execute_command(): starting 11000 1726867165.84017: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867165.758975-12247-197847280284662/ /root/.ansible/tmp/ansible-tmp-1726867165.758975-12247-197847280284662/AnsiballZ_systemd.py && sleep 0' 11000 1726867165.84605: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867165.84624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867165.84699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867165.84760: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867165.84790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867165.84864: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867165.86649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867165.86653: stdout chunk (state=3): >>><<< 11000 1726867165.86659: stderr chunk (state=3): >>><<< 11000 1726867165.86672: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867165.86676: _low_level_execute_command(): starting 11000 1726867165.86883: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867165.758975-12247-197847280284662/AnsiballZ_systemd.py && sleep 0' 11000 1726867165.87242: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867165.87251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867165.87262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867165.87275: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867165.87293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867165.87300: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867165.87306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867165.87327: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11000 1726867165.87335: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is 
address <<< 11000 1726867165.87341: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11000 1726867165.87350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867165.87359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867165.87371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867165.87380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867165.87387: stderr chunk (state=3): >>>debug2: match found <<< 11000 1726867165.87443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867165.87473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867165.87494: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867165.87514: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867165.87583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867166.16219: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": "14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": 
"/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10600448", "MemoryPeak": "14745600", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3297619968", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "497355000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 11000 1726867166.16257: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw 
cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target NetworkManager-wait-online.service network.target", "After": "dbus-broker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", 
"SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", "AssertTimestampMonotonic": "14925366", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11000 1726867166.18238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867166.18242: stderr chunk (state=3): >>>Shared connection to 10.31.12.57 closed. <<< 11000 1726867166.18310: stderr chunk (state=3): >>><<< 11000 1726867166.18313: stdout chunk (state=3): >>><<< 11000 1726867166.18329: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "700", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainStartTimestampMonotonic": "14926291", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ExecMainHandoffTimestampMonotonic": "14939781", "ExecMainPID": "700", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; 
pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10600448", "MemoryPeak": "14745600", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3297619968", "EffectiveMemoryMax": "3702865920", "EffectiveMemoryHigh": "3702865920", "CPUUsageNSec": "497355000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin 
cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice dbus.socket sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target cloud-init.service multi-user.target NetworkManager-wait-online.service network.target", "After": "dbus-broker.service system.slice dbus.socket cloud-init-local.service systemd-journald.socket network-pre.target sysinit.target basic.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 17:19:04 EDT", "StateChangeTimestampMonotonic": "389647514", "InactiveExitTimestamp": "Fri 2024-09-20 17:12:48 EDT", "InactiveExitTimestampMonotonic": "14926806", "ActiveEnterTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ActiveEnterTimestampMonotonic": "15147389", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": 
"yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 17:12:48 EDT", "ConditionTimestampMonotonic": "14925363", "AssertTimestamp": "Fri 2024-09-20 17:12:48 EDT", "AssertTimestampMonotonic": "14925366", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b0b064de3fd6461fb15e6ed03d93664a", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
11000 1726867166.18624: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867165.758975-12247-197847280284662/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867166.18628: _low_level_execute_command(): starting 11000 1726867166.18630: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867165.758975-12247-197847280284662/ > /dev/null 2>&1 && sleep 0' 11000 1726867166.19160: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867166.19174: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867166.19195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867166.19298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867166.19311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867166.19327: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867166.19348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867166.19421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867166.21283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867166.21310: stdout chunk (state=3): >>><<< 11000 1726867166.21324: stderr chunk (state=3): >>><<< 11000 1726867166.21345: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867166.21360: handler run complete 11000 1726867166.21444: attempt loop complete, returning result 11000 1726867166.21451: _execute() done 11000 1726867166.21457: dumping result to json 11000 1726867166.21479: done dumping result, returning 11000 1726867166.21496: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcac9-a3a5-c734-026a-000000000089] 11000 1726867166.21512: sending task result for task 0affcac9-a3a5-c734-026a-000000000089 11000 1726867166.21846: done sending task result for task 0affcac9-a3a5-c734-026a-000000000089 11000 1726867166.21849: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11000 1726867166.21910: no more pending results, returning what we have 11000 1726867166.21914: results queue empty 11000 1726867166.21915: checking for any_errors_fatal 11000 1726867166.21921: done checking for any_errors_fatal 11000 1726867166.21921: checking for max_fail_percentage 11000 1726867166.21923: done checking for max_fail_percentage 11000 1726867166.21924: checking to see if all hosts have failed and the running result is not ok 11000 1726867166.21924: done checking to see if all hosts have failed 11000 1726867166.21925: getting the remaining hosts for this loop 11000 1726867166.21926: done getting the remaining hosts for this loop 11000 1726867166.21929: getting the next task for host managed_node1 11000 1726867166.21937: done getting next task for host managed_node1 11000 1726867166.21941: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11000 1726867166.21945: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11000 1726867166.21956: getting variables 11000 1726867166.21958: in VariableManager get_vars() 11000 1726867166.21999: Calling all_inventory to load vars for managed_node1 11000 1726867166.22002: Calling groups_inventory to load vars for managed_node1 11000 1726867166.22004: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867166.22015: Calling all_plugins_play to load vars for managed_node1 11000 1726867166.22017: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867166.22020: Calling groups_plugins_play to load vars for managed_node1 11000 1726867166.23795: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867166.25422: done with get_vars() 11000 1726867166.25452: done getting variables 11000 1726867166.25521: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 17:19:26 -0400 (0:00:00.647) 0:00:27.898 ****** 11000 1726867166.25566: entering _queue_task() for managed_node1/service 11000 1726867166.25993: worker is 1 (out of 1 available) 11000 1726867166.26004: exiting _queue_task() for managed_node1/service 11000 1726867166.26015: done queuing things up, now waiting for results queue to drain 11000 1726867166.26016: waiting for pending results... 11000 1726867166.26363: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11000 1726867166.26368: in run() - task 0affcac9-a3a5-c734-026a-00000000008a 11000 1726867166.26371: variable 'ansible_search_path' from source: unknown 11000 1726867166.26374: variable 'ansible_search_path' from source: unknown 11000 1726867166.26418: calling self._execute() 11000 1726867166.26571: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867166.26575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867166.26579: variable 'omit' from source: magic vars 11000 1726867166.26935: variable 'ansible_distribution_major_version' from source: facts 11000 1726867166.26953: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867166.27080: variable 'network_provider' from source: set_fact 11000 1726867166.27095: Evaluated conditional (network_provider == "nm"): True 11000 1726867166.27222: variable '__network_wpa_supplicant_required' from source: role '' defaults 11000 1726867166.27300: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11000 1726867166.27483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867166.29952: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867166.29991: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867166.30034: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867166.30082: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867166.30115: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867166.30275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867166.30280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867166.30284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867166.30320: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867166.30340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867166.30402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867166.30483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867166.30486: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867166.30516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867166.30538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867166.30584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867166.30623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867166.30654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867166.30702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867166.30730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 11000 1726867166.30884: variable 'network_connections' from source: task vars 11000 1726867166.30946: variable 'port2_profile' from source: play vars 11000 1726867166.30983: variable 'port2_profile' from source: play vars 11000 1726867166.31004: variable 'port1_profile' from source: play vars 11000 1726867166.31073: variable 'port1_profile' from source: play vars 11000 1726867166.31093: variable 'controller_profile' from source: play vars 11000 1726867166.31161: variable 'controller_profile' from source: play vars 11000 1726867166.31269: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11000 1726867166.31423: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11000 1726867166.31460: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11000 1726867166.31500: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11000 1726867166.31529: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11000 1726867166.31572: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11000 1726867166.31606: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11000 1726867166.31633: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867166.31660: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11000 1726867166.31719: variable '__network_wireless_connections_defined' from source: role '' defaults 11000 1726867166.31952: variable 'network_connections' from source: task vars 11000 1726867166.31963: variable 'port2_profile' from source: play vars 11000 1726867166.32036: variable 'port2_profile' from source: play vars 11000 1726867166.32053: variable 'port1_profile' from source: play vars 11000 1726867166.32148: variable 'port1_profile' from source: play vars 11000 1726867166.32159: variable 'controller_profile' from source: play vars 11000 1726867166.32255: variable 'controller_profile' from source: play vars 11000 1726867166.32259: Evaluated conditional (__network_wpa_supplicant_required): False 11000 1726867166.32261: when evaluation is False, skipping this task 11000 1726867166.32263: _execute() done 11000 1726867166.32266: dumping result to json 11000 1726867166.32271: done dumping result, returning 11000 1726867166.32287: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcac9-a3a5-c734-026a-00000000008a] 11000 1726867166.32300: sending task result for task 0affcac9-a3a5-c734-026a-00000000008a 11000 1726867166.32440: done sending task result for task 0affcac9-a3a5-c734-026a-00000000008a 11000 1726867166.32443: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11000 
1726867166.32501: no more pending results, returning what we have 11000 1726867166.32505: results queue empty 11000 1726867166.32506: checking for any_errors_fatal 11000 1726867166.32527: done checking for any_errors_fatal 11000 1726867166.32528: checking for max_fail_percentage 11000 1726867166.32530: done checking for max_fail_percentage 11000 1726867166.32531: checking to see if all hosts have failed and the running result is not ok 11000 1726867166.32533: done checking to see if all hosts have failed 11000 1726867166.32533: getting the remaining hosts for this loop 11000 1726867166.32535: done getting the remaining hosts for this loop 11000 1726867166.32539: getting the next task for host managed_node1 11000 1726867166.32545: done getting next task for host managed_node1 11000 1726867166.32549: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11000 1726867166.32553: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867166.32574: getting variables 11000 1726867166.32575: in VariableManager get_vars() 11000 1726867166.32624: Calling all_inventory to load vars for managed_node1 11000 1726867166.32627: Calling groups_inventory to load vars for managed_node1 11000 1726867166.32630: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867166.32641: Calling all_plugins_play to load vars for managed_node1 11000 1726867166.32644: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867166.32647: Calling groups_plugins_play to load vars for managed_node1 11000 1726867166.34448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867166.36017: done with get_vars() 11000 1726867166.36036: done getting variables 11000 1726867166.36096: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 17:19:26 -0400 (0:00:00.105) 0:00:28.004 ****** 11000 1726867166.36125: entering _queue_task() for managed_node1/service 11000 1726867166.36519: worker is 1 (out of 1 available) 11000 1726867166.36530: exiting _queue_task() for managed_node1/service 11000 1726867166.36541: done queuing things up, now waiting for results queue to drain 11000 1726867166.36543: waiting for pending results... 
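The block above shows the gating of the wpa_supplicant task at tasks/main.yml:133: ansible_distribution_major_version != '6' and network_provider == "nm" both evaluated True, but __network_wpa_supplicant_required evaluated False (the role loads it from its defaults alongside __network_ieee802_1x_connections_defined and __network_wireless_connections_defined, so it appears to depend on whether any 802.1x or wireless connections are requested), and the task was skipped without contacting the managed host. Reconstructed from those conditions, the skipped task amounts to roughly the sketch below; only the conditions and the service action are visible in the log, the module arguments are an assumption.

  - name: Enable and start wpa_supplicant
    ansible.builtin.service:
      name: wpa_supplicant        # service name inferred from the task name
      state: started
      enabled: true
    when:
      - network_provider == "nm"
      - __network_wpa_supplicant_required
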
11000 1726867166.36825: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 11000 1726867166.36922: in run() - task 0affcac9-a3a5-c734-026a-00000000008b 11000 1726867166.36926: variable 'ansible_search_path' from source: unknown 11000 1726867166.36928: variable 'ansible_search_path' from source: unknown 11000 1726867166.36939: calling self._execute() 11000 1726867166.37046: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867166.37056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867166.37069: variable 'omit' from source: magic vars 11000 1726867166.37457: variable 'ansible_distribution_major_version' from source: facts 11000 1726867166.37486: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867166.37612: variable 'network_provider' from source: set_fact 11000 1726867166.37686: Evaluated conditional (network_provider == "initscripts"): False 11000 1726867166.37691: when evaluation is False, skipping this task 11000 1726867166.37694: _execute() done 11000 1726867166.37696: dumping result to json 11000 1726867166.37698: done dumping result, returning 11000 1726867166.37700: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [0affcac9-a3a5-c734-026a-00000000008b] 11000 1726867166.37703: sending task result for task 0affcac9-a3a5-c734-026a-00000000008b 11000 1726867166.37766: done sending task result for task 0affcac9-a3a5-c734-026a-00000000008b 11000 1726867166.37769: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11000 1726867166.37829: no more pending results, returning what we have 11000 1726867166.37832: results queue empty 11000 1726867166.37833: checking for any_errors_fatal 11000 1726867166.37841: done checking for any_errors_fatal 11000 1726867166.37842: checking for max_fail_percentage 11000 1726867166.37843: done checking for max_fail_percentage 11000 1726867166.37844: checking to see if all hosts have failed and the running result is not ok 11000 1726867166.37845: done checking to see if all hosts have failed 11000 1726867166.37846: getting the remaining hosts for this loop 11000 1726867166.37847: done getting the remaining hosts for this loop 11000 1726867166.37851: getting the next task for host managed_node1 11000 1726867166.37858: done getting next task for host managed_node1 11000 1726867166.37861: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11000 1726867166.37865: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 11000 1726867166.37890: getting variables 11000 1726867166.37891: in VariableManager get_vars() 11000 1726867166.37929: Calling all_inventory to load vars for managed_node1 11000 1726867166.37932: Calling groups_inventory to load vars for managed_node1 11000 1726867166.37934: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867166.37946: Calling all_plugins_play to load vars for managed_node1 11000 1726867166.37949: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867166.37951: Calling groups_plugins_play to load vars for managed_node1 11000 1726867166.39462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867166.41098: done with get_vars() 11000 1726867166.41118: done getting variables 11000 1726867166.41181: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 17:19:26 -0400 (0:00:00.050) 0:00:28.055 ****** 11000 1726867166.41220: entering _queue_task() for managed_node1/copy 11000 1726867166.41706: worker is 1 (out of 1 available) 11000 1726867166.41715: exiting _queue_task() for managed_node1/copy 11000 1726867166.41726: done queuing things up, now waiting for results queue to drain 11000 1726867166.41727: waiting for pending results... 
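The next two tasks follow the same pattern: "Enable network service" (tasks/main.yml:142, dispatched through the service action) was just skipped because network_provider == "initscripts" evaluated False for this nm-provider run, and the "Ensure initscripts network file dependency is present" copy task queued above is skipped on the same condition immediately below, again without any connection to the host. Only the gate is visible in the log; in outline it looks like this sketch, where the service name is an assumption.

  - name: Enable network service
    ansible.builtin.service:
      name: network               # legacy initscripts service; name assumed
      enabled: true
    when: network_provider == "initscripts"
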
11000 1726867166.41966: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11000 1726867166.41973: in run() - task 0affcac9-a3a5-c734-026a-00000000008c 11000 1726867166.41998: variable 'ansible_search_path' from source: unknown 11000 1726867166.42007: variable 'ansible_search_path' from source: unknown 11000 1726867166.42045: calling self._execute() 11000 1726867166.42154: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867166.42172: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867166.42194: variable 'omit' from source: magic vars 11000 1726867166.42559: variable 'ansible_distribution_major_version' from source: facts 11000 1726867166.42575: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867166.42697: variable 'network_provider' from source: set_fact 11000 1726867166.42713: Evaluated conditional (network_provider == "initscripts"): False 11000 1726867166.42723: when evaluation is False, skipping this task 11000 1726867166.42729: _execute() done 11000 1726867166.42734: dumping result to json 11000 1726867166.42740: done dumping result, returning 11000 1726867166.42750: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcac9-a3a5-c734-026a-00000000008c] 11000 1726867166.42758: sending task result for task 0affcac9-a3a5-c734-026a-00000000008c skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11000 1726867166.43016: no more pending results, returning what we have 11000 1726867166.43019: results queue empty 11000 1726867166.43021: checking for any_errors_fatal 11000 1726867166.43027: done checking for any_errors_fatal 11000 1726867166.43030: checking for max_fail_percentage 11000 1726867166.43032: done checking for max_fail_percentage 11000 1726867166.43033: checking to see if all hosts have failed and the running result is not ok 11000 1726867166.43034: done checking to see if all hosts have failed 11000 1726867166.43034: getting the remaining hosts for this loop 11000 1726867166.43036: done getting the remaining hosts for this loop 11000 1726867166.43039: getting the next task for host managed_node1 11000 1726867166.43046: done getting next task for host managed_node1 11000 1726867166.43050: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11000 1726867166.43054: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11000 1726867166.43074: getting variables 11000 1726867166.43076: in VariableManager get_vars() 11000 1726867166.43120: Calling all_inventory to load vars for managed_node1 11000 1726867166.43123: Calling groups_inventory to load vars for managed_node1 11000 1726867166.43126: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867166.43138: Calling all_plugins_play to load vars for managed_node1 11000 1726867166.43141: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867166.43144: Calling groups_plugins_play to load vars for managed_node1 11000 1726867166.43797: done sending task result for task 0affcac9-a3a5-c734-026a-00000000008c 11000 1726867166.43801: WORKER PROCESS EXITING 11000 1726867166.44701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867166.46301: done with get_vars() 11000 1726867166.46319: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 17:19:26 -0400 (0:00:00.051) 0:00:28.107 ****** 11000 1726867166.46411: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 11000 1726867166.46796: worker is 1 (out of 1 available) 11000 1726867166.46807: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 11000 1726867166.46818: done queuing things up, now waiting for results queue to drain 11000 1726867166.46819: waiting for pending results... 11000 1726867166.46996: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11000 1726867166.47155: in run() - task 0affcac9-a3a5-c734-026a-00000000008d 11000 1726867166.47175: variable 'ansible_search_path' from source: unknown 11000 1726867166.47187: variable 'ansible_search_path' from source: unknown 11000 1726867166.47231: calling self._execute() 11000 1726867166.47340: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867166.47352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867166.47372: variable 'omit' from source: magic vars 11000 1726867166.47752: variable 'ansible_distribution_major_version' from source: facts 11000 1726867166.47775: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867166.47792: variable 'omit' from source: magic vars 11000 1726867166.47861: variable 'omit' from source: magic vars 11000 1726867166.48043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11000 1726867166.50232: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11000 1726867166.50386: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11000 1726867166.50392: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11000 1726867166.50395: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11000 1726867166.50422: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11000 1726867166.50506: variable 'network_provider' from source: set_fact 11000 1726867166.50635: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11000 1726867166.50686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11000 1726867166.50729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11000 1726867166.50775: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11000 1726867166.50802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11000 1726867166.50884: variable 'omit' from source: magic vars 11000 1726867166.51008: variable 'omit' from source: magic vars 11000 1726867166.51149: variable 'network_connections' from source: task vars 11000 1726867166.51153: variable 'port2_profile' from source: play vars 11000 1726867166.51200: variable 'port2_profile' from source: play vars 11000 1726867166.51215: variable 'port1_profile' from source: play vars 11000 1726867166.51282: variable 'port1_profile' from source: play vars 11000 1726867166.51300: variable 'controller_profile' from source: play vars 11000 1726867166.51367: variable 'controller_profile' from source: play vars 11000 1726867166.51585: variable 'omit' from source: magic vars 11000 1726867166.51591: variable '__lsr_ansible_managed' from source: task vars 11000 1726867166.51606: variable '__lsr_ansible_managed' from source: task vars 11000 1726867166.51784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11000 1726867166.52029: Loaded config def from plugin (lookup/template) 11000 1726867166.52039: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11000 1726867166.52069: File lookup term: get_ansible_managed.j2 11000 1726867166.52080: variable 'ansible_search_path' from source: unknown 11000 1726867166.52093: evaluation_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11000 1726867166.52110: search_path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11000 1726867166.52182: variable 'ansible_search_path' from source: unknown 11000 1726867166.59601: variable 'ansible_managed' from source: unknown 11000 1726867166.59685: variable 'omit' from source: magic vars 11000 1726867166.59707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867166.59727: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867166.59740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867166.59754: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867166.59763: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867166.59786: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867166.59791: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867166.59794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867166.59854: Set connection var ansible_shell_type to sh 11000 1726867166.59860: Set connection var ansible_pipelining to False 11000 1726867166.59869: Set connection var ansible_shell_executable to /bin/sh 11000 1726867166.59872: Set connection var ansible_connection to ssh 11000 1726867166.59875: Set connection var ansible_timeout to 10 11000 1726867166.59882: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867166.59903: variable 'ansible_shell_executable' from source: unknown 11000 1726867166.59906: variable 'ansible_connection' from source: unknown 11000 1726867166.59909: variable 'ansible_module_compression' from source: unknown 11000 1726867166.59911: variable 'ansible_shell_type' from source: unknown 11000 1726867166.59915: variable 'ansible_shell_executable' from source: unknown 11000 1726867166.59917: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867166.59919: variable 'ansible_pipelining' from source: unknown 11000 1726867166.59921: variable 'ansible_timeout' from source: unknown 11000 1726867166.59933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867166.60020: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11000 1726867166.60028: variable 'omit' from source: magic vars 11000 1726867166.60034: starting attempt loop 11000 1726867166.60037: running the handler 11000 1726867166.60048: _low_level_execute_command(): starting 11000 1726867166.60055: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867166.60523: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867166.60527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867166.60529: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867166.60531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867166.60570: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867166.60573: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867166.60654: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867166.62348: stdout chunk (state=3): >>>/root <<< 11000 1726867166.62507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867166.62510: stdout chunk (state=3): >>><<< 11000 1726867166.62513: stderr chunk (state=3): >>><<< 11000 1726867166.62631: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867166.62635: _low_level_execute_command(): starting 11000 1726867166.62637: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867166.6253939-12277-507951606513 `" && echo ansible-tmp-1726867166.6253939-12277-507951606513="` echo /root/.ansible/tmp/ansible-tmp-1726867166.6253939-12277-507951606513 `" ) && sleep 0' 11000 1726867166.63185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867166.63202: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867166.63223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867166.63241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867166.63257: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 
originally 10.31.12.57 <<< 11000 1726867166.63267: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867166.63282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867166.63332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867166.63401: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867166.63437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867166.63449: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867166.63519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867166.65417: stdout chunk (state=3): >>>ansible-tmp-1726867166.6253939-12277-507951606513=/root/.ansible/tmp/ansible-tmp-1726867166.6253939-12277-507951606513 <<< 11000 1726867166.65564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867166.65576: stdout chunk (state=3): >>><<< 11000 1726867166.65599: stderr chunk (state=3): >>><<< 11000 1726867166.65617: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867166.6253939-12277-507951606513=/root/.ansible/tmp/ansible-tmp-1726867166.6253939-12277-507951606513 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867166.65791: variable 'ansible_module_compression' from source: unknown 11000 1726867166.65795: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 11000 1726867166.65797: variable 'ansible_facts' from source: unknown 11000 1726867166.65933: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867166.6253939-12277-507951606513/AnsiballZ_network_connections.py 11000 1726867166.66132: Sending initial data 
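
The records above show the non-pipelined module path for this task: the run resolves its connection settings (ansible_pipelining is False, module compression ZIP_DEFLATED), creates a remote temp directory over the multiplexed SSH session, and begins transferring AnsiballZ_network_connections.py by SFTP. For reference, the same resolved settings could be pinned explicitly as inventory host vars; a minimal sketch, with an illustrative file path and values copied from the "Set connection var" records earlier in this log:

# host_vars/managed_node1.yml (illustrative path; values mirror this run)
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false        # pipelining off, hence the remote temp dir + SFTP transfer seen here
ansible_timeout: 10
ansible_module_compression: ZIP_DEFLATED
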
11000 1726867166.66141: Sent initial data (165 bytes) 11000 1726867166.66765: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867166.66790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867166.66805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867166.66824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867166.66899: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867166.66947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867166.66965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867166.67004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867166.67080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867166.68714: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867166.68774: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867166.68840: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpvxmifs5c /root/.ansible/tmp/ansible-tmp-1726867166.6253939-12277-507951606513/AnsiballZ_network_connections.py <<< 11000 1726867166.68845: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867166.6253939-12277-507951606513/AnsiballZ_network_connections.py" <<< 11000 1726867166.68900: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpvxmifs5c" to remote "/root/.ansible/tmp/ansible-tmp-1726867166.6253939-12277-507951606513/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867166.6253939-12277-507951606513/AnsiballZ_network_connections.py" <<< 11000 1726867166.69991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867166.70117: stderr chunk (state=3): >>><<< 11000 1726867166.70120: stdout chunk (state=3): >>><<< 11000 1726867166.70122: done transferring module to remote 11000 1726867166.70124: _low_level_execute_command(): starting 11000 1726867166.70126: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867166.6253939-12277-507951606513/ /root/.ansible/tmp/ansible-tmp-1726867166.6253939-12277-507951606513/AnsiballZ_network_connections.py && sleep 0' 11000 1726867166.70706: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867166.70737: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867166.70751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867166.70840: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867166.70975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867166.71099: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867166.71153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867166.72944: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867166.73000: stderr chunk (state=3): >>><<< 11000 1726867166.73010: stdout chunk (state=3): >>><<< 11000 1726867166.73029: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 
10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867166.73111: _low_level_execute_command(): starting 11000 1726867166.73114: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867166.6253939-12277-507951606513/AnsiballZ_network_connections.py && sleep 0' 11000 1726867166.73645: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867166.73718: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867166.73842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867166.74096: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867166.74331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867167.26021: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 11000 1726867167.26083: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ddr4pl53/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ddr4pl53/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/a97880b0-dde0-4b51-ba2e-4449038703da: error=unknown <<< 11000 1726867167.27735: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 11000 
1726867167.27770: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ddr4pl53/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ddr4pl53/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail<<< 11000 1726867167.27785: stdout chunk (state=3): >>> ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/f31f4939-dccd-4694-8e6c-e832cbfb865b: error=unknown <<< 11000 1726867167.29474: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ddr4pl53/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ddr4pl53/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/a61dfef3-6218-4c4f-ba0f-002676378e96: error=unknown <<< 11000 1726867167.29751: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11000 1726867167.31896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 11000 1726867167.31899: stdout chunk (state=3): >>><<< 11000 1726867167.31902: stderr chunk (state=3): >>><<< 11000 1726867167.31921: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ddr4pl53/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ddr4pl53/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/a97880b0-dde0-4b51-ba2e-4449038703da: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ddr4pl53/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ddr4pl53/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/f31f4939-dccd-4694-8e6c-e832cbfb865b: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ddr4pl53/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ddr4pl53/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/a61dfef3-6218-4c4f-ba0f-002676378e96: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 11000 1726867167.32060: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867166.6253939-12277-507951606513/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867167.32063: _low_level_execute_command(): starting 11000 1726867167.32065: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867166.6253939-12277-507951606513/ > /dev/null 2>&1 && sleep 0' 11000 1726867167.32611: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867167.32628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867167.32643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867167.32659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867167.32681: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867167.32693: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867167.32745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867167.32806: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867167.32824: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867167.32854: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867167.32948: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867167.34834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867167.34837: stdout chunk (state=3): >>><<< 11000 1726867167.34982: stderr chunk (state=3): >>><<< 11000 1726867167.34998: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867167.35001: handler run complete 11000 1726867167.35003: attempt loop complete, returning result 11000 1726867167.35004: _execute() done 11000 1726867167.35006: dumping result to json 11000 1726867167.35007: done dumping result, returning 11000 1726867167.35009: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcac9-a3a5-c734-026a-00000000008d] 11000 1726867167.35011: sending task result for task 0affcac9-a3a5-c734-026a-00000000008d 11000 1726867167.35080: done sending task result for task 0affcac9-a3a5-c734-026a-00000000008d 11000 1726867167.35083: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 11000 1726867167.35213: no more pending results, returning what we have 11000 1726867167.35217: results queue empty 11000 1726867167.35218: checking for any_errors_fatal 11000 1726867167.35224: done checking for any_errors_fatal 11000 1726867167.35225: checking for max_fail_percentage 
11000 1726867167.35227: done checking for max_fail_percentage 11000 1726867167.35228: checking to see if all hosts have failed and the running result is not ok 11000 1726867167.35229: done checking to see if all hosts have failed 11000 1726867167.35229: getting the remaining hosts for this loop 11000 1726867167.35231: done getting the remaining hosts for this loop 11000 1726867167.35235: getting the next task for host managed_node1 11000 1726867167.35242: done getting next task for host managed_node1 11000 1726867167.35246: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11000 1726867167.35251: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867167.35263: getting variables 11000 1726867167.35265: in VariableManager get_vars() 11000 1726867167.35495: Calling all_inventory to load vars for managed_node1 11000 1726867167.35498: Calling groups_inventory to load vars for managed_node1 11000 1726867167.35504: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867167.35513: Calling all_plugins_play to load vars for managed_node1 11000 1726867167.35516: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867167.35518: Calling groups_plugins_play to load vars for managed_node1 11000 1726867167.36849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867167.38613: done with get_vars() 11000 1726867167.38634: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 17:19:27 -0400 (0:00:00.923) 0:00:29.030 ****** 11000 1726867167.38731: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 11000 1726867167.39206: worker is 1 (out of 1 available) 11000 1726867167.39218: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 11000 1726867167.39231: done queuing things up, now waiting for results queue to drain 11000 1726867167.39232: waiting for pending results... 
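
The changed result reported above comes from fedora.linux_system_roles.network_connections tearing down bond0 and its two port profiles (each persistent_state: absent, state: down). A hedged sketch of a play that would drive the same teardown through the role follows; the play wrapper is assumed, and only the connection list mirrors the logged module_args:

# Sketch -- play structure assumed; network_connections entries copied from the result above
- hosts: managed_node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: bond0.1
            persistent_state: absent
            state: down
          - name: bond0.0
            persistent_state: absent
            state: down
          - name: bond0
            persistent_state: absent
            state: down
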
11000 1726867167.39413: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 11000 1726867167.39583: in run() - task 0affcac9-a3a5-c734-026a-00000000008e 11000 1726867167.39605: variable 'ansible_search_path' from source: unknown 11000 1726867167.39614: variable 'ansible_search_path' from source: unknown 11000 1726867167.39683: calling self._execute() 11000 1726867167.39766: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867167.39781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867167.39884: variable 'omit' from source: magic vars 11000 1726867167.40195: variable 'ansible_distribution_major_version' from source: facts 11000 1726867167.40214: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867167.40346: variable 'network_state' from source: role '' defaults 11000 1726867167.40361: Evaluated conditional (network_state != {}): False 11000 1726867167.40369: when evaluation is False, skipping this task 11000 1726867167.40376: _execute() done 11000 1726867167.40387: dumping result to json 11000 1726867167.40394: done dumping result, returning 11000 1726867167.40406: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [0affcac9-a3a5-c734-026a-00000000008e] 11000 1726867167.40415: sending task result for task 0affcac9-a3a5-c734-026a-00000000008e 11000 1726867167.40788: done sending task result for task 0affcac9-a3a5-c734-026a-00000000008e 11000 1726867167.40791: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11000 1726867167.40831: no more pending results, returning what we have 11000 1726867167.40835: results queue empty 11000 1726867167.40835: checking for any_errors_fatal 11000 1726867167.40843: done checking for any_errors_fatal 11000 1726867167.40844: checking for max_fail_percentage 11000 1726867167.40845: done checking for max_fail_percentage 11000 1726867167.40846: checking to see if all hosts have failed and the running result is not ok 11000 1726867167.40847: done checking to see if all hosts have failed 11000 1726867167.40847: getting the remaining hosts for this loop 11000 1726867167.40848: done getting the remaining hosts for this loop 11000 1726867167.40851: getting the next task for host managed_node1 11000 1726867167.40857: done getting next task for host managed_node1 11000 1726867167.40860: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11000 1726867167.40864: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 11000 1726867167.40881: getting variables 11000 1726867167.40883: in VariableManager get_vars() 11000 1726867167.40921: Calling all_inventory to load vars for managed_node1 11000 1726867167.40924: Calling groups_inventory to load vars for managed_node1 11000 1726867167.40926: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867167.40935: Calling all_plugins_play to load vars for managed_node1 11000 1726867167.40938: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867167.40940: Calling groups_plugins_play to load vars for managed_node1 11000 1726867167.42254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867167.43886: done with get_vars() 11000 1726867167.43919: done getting variables 11000 1726867167.43990: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 17:19:27 -0400 (0:00:00.052) 0:00:29.083 ****** 11000 1726867167.44028: entering _queue_task() for managed_node1/debug 11000 1726867167.44503: worker is 1 (out of 1 available) 11000 1726867167.44516: exiting _queue_task() for managed_node1/debug 11000 1726867167.44528: done queuing things up, now waiting for results queue to drain 11000 1726867167.44529: waiting for pending results... 
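
The "Configure networking state" task above was skipped because network_state was left at its empty role default, so the conditional network_state != {} evaluated False (while ansible_distribution_major_version != '6' evaluated True). The guard pattern is sketched below; the debug module is only a runnable stand-in, since the real task calls the collection's network_state module with arguments that are not visible in this log:

# Guard pattern only -- not the role's actual task body
- name: Configure networking state (guard pattern)
  ansible.builtin.debug:
    msg: "would apply {{ network_state }}"
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}
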
11000 1726867167.44740: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11000 1726867167.44923: in run() - task 0affcac9-a3a5-c734-026a-00000000008f 11000 1726867167.44947: variable 'ansible_search_path' from source: unknown 11000 1726867167.44956: variable 'ansible_search_path' from source: unknown 11000 1726867167.45005: calling self._execute() 11000 1726867167.45127: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867167.45141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867167.45156: variable 'omit' from source: magic vars 11000 1726867167.45547: variable 'ansible_distribution_major_version' from source: facts 11000 1726867167.45571: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867167.45676: variable 'omit' from source: magic vars 11000 1726867167.45681: variable 'omit' from source: magic vars 11000 1726867167.45700: variable 'omit' from source: magic vars 11000 1726867167.45744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867167.45792: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867167.45818: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867167.45883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867167.45891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867167.45897: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867167.45910: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867167.45919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867167.46026: Set connection var ansible_shell_type to sh 11000 1726867167.46041: Set connection var ansible_pipelining to False 11000 1726867167.46056: Set connection var ansible_shell_executable to /bin/sh 11000 1726867167.46063: Set connection var ansible_connection to ssh 11000 1726867167.46111: Set connection var ansible_timeout to 10 11000 1726867167.46114: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867167.46122: variable 'ansible_shell_executable' from source: unknown 11000 1726867167.46131: variable 'ansible_connection' from source: unknown 11000 1726867167.46138: variable 'ansible_module_compression' from source: unknown 11000 1726867167.46145: variable 'ansible_shell_type' from source: unknown 11000 1726867167.46151: variable 'ansible_shell_executable' from source: unknown 11000 1726867167.46158: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867167.46164: variable 'ansible_pipelining' from source: unknown 11000 1726867167.46220: variable 'ansible_timeout' from source: unknown 11000 1726867167.46223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867167.46335: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 
1726867167.46439: variable 'omit' from source: magic vars 11000 1726867167.46442: starting attempt loop 11000 1726867167.46446: running the handler 11000 1726867167.46500: variable '__network_connections_result' from source: set_fact 11000 1726867167.46563: handler run complete 11000 1726867167.46590: attempt loop complete, returning result 11000 1726867167.46599: _execute() done 11000 1726867167.46607: dumping result to json 11000 1726867167.46615: done dumping result, returning 11000 1726867167.46628: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcac9-a3a5-c734-026a-00000000008f] 11000 1726867167.46638: sending task result for task 0affcac9-a3a5-c734-026a-00000000008f ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 11000 1726867167.46834: no more pending results, returning what we have 11000 1726867167.46837: results queue empty 11000 1726867167.46839: checking for any_errors_fatal 11000 1726867167.46845: done checking for any_errors_fatal 11000 1726867167.46845: checking for max_fail_percentage 11000 1726867167.46847: done checking for max_fail_percentage 11000 1726867167.46848: checking to see if all hosts have failed and the running result is not ok 11000 1726867167.46849: done checking to see if all hosts have failed 11000 1726867167.46849: getting the remaining hosts for this loop 11000 1726867167.46851: done getting the remaining hosts for this loop 11000 1726867167.46854: getting the next task for host managed_node1 11000 1726867167.46860: done getting next task for host managed_node1 11000 1726867167.46864: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11000 1726867167.46868: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11000 1726867167.46881: getting variables 11000 1726867167.46882: in VariableManager get_vars() 11000 1726867167.46920: Calling all_inventory to load vars for managed_node1 11000 1726867167.46923: Calling groups_inventory to load vars for managed_node1 11000 1726867167.46925: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867167.46935: Calling all_plugins_play to load vars for managed_node1 11000 1726867167.46937: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867167.46940: Calling groups_plugins_play to load vars for managed_node1 11000 1726867167.47592: done sending task result for task 0affcac9-a3a5-c734-026a-00000000008f 11000 1726867167.47596: WORKER PROCESS EXITING 11000 1726867167.48681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867167.50200: done with get_vars() 11000 1726867167.50220: done getting variables 11000 1726867167.50283: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 17:19:27 -0400 (0:00:00.062) 0:00:29.146 ****** 11000 1726867167.50319: entering _queue_task() for managed_node1/debug 11000 1726867167.50699: worker is 1 (out of 1 available) 11000 1726867167.50710: exiting _queue_task() for managed_node1/debug 11000 1726867167.50720: done queuing things up, now waiting for results queue to drain 11000 1726867167.50721: waiting for pending results... 
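
Around this point the role runs a pair of debug tasks against the registered __network_connections_result: the one just completed printed its stderr_lines (a single empty line), and the one queued below prints the full result. Judging from the printed ok: output, their shape is roughly:

# Inferred from the ok: output in this log; the exact task bodies live at
# roles/network/tasks/main.yml:177 and :181 in the collection
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines
  when: ansible_distribution_major_version != '6'

- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result
  when: ansible_distribution_major_version != '6'
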
11000 1726867167.50919: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11000 1726867167.51071: in run() - task 0affcac9-a3a5-c734-026a-000000000090 11000 1726867167.51096: variable 'ansible_search_path' from source: unknown 11000 1726867167.51107: variable 'ansible_search_path' from source: unknown 11000 1726867167.51144: calling self._execute() 11000 1726867167.51247: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867167.51258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867167.51276: variable 'omit' from source: magic vars 11000 1726867167.51656: variable 'ansible_distribution_major_version' from source: facts 11000 1726867167.51672: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867167.51686: variable 'omit' from source: magic vars 11000 1726867167.51782: variable 'omit' from source: magic vars 11000 1726867167.51805: variable 'omit' from source: magic vars 11000 1726867167.51847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867167.51894: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867167.51973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867167.51978: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867167.51981: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867167.51991: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867167.51999: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867167.52006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867167.52111: Set connection var ansible_shell_type to sh 11000 1726867167.52124: Set connection var ansible_pipelining to False 11000 1726867167.52136: Set connection var ansible_shell_executable to /bin/sh 11000 1726867167.52142: Set connection var ansible_connection to ssh 11000 1726867167.52151: Set connection var ansible_timeout to 10 11000 1726867167.52159: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867167.52199: variable 'ansible_shell_executable' from source: unknown 11000 1726867167.52292: variable 'ansible_connection' from source: unknown 11000 1726867167.52298: variable 'ansible_module_compression' from source: unknown 11000 1726867167.52300: variable 'ansible_shell_type' from source: unknown 11000 1726867167.52302: variable 'ansible_shell_executable' from source: unknown 11000 1726867167.52304: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867167.52306: variable 'ansible_pipelining' from source: unknown 11000 1726867167.52308: variable 'ansible_timeout' from source: unknown 11000 1726867167.52310: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867167.52386: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 
1726867167.52407: variable 'omit' from source: magic vars 11000 1726867167.52417: starting attempt loop 11000 1726867167.52509: running the handler 11000 1726867167.52512: variable '__network_connections_result' from source: set_fact 11000 1726867167.52564: variable '__network_connections_result' from source: set_fact 11000 1726867167.52696: handler run complete 11000 1726867167.52732: attempt loop complete, returning result 11000 1726867167.52750: _execute() done 11000 1726867167.52759: dumping result to json 11000 1726867167.52769: done dumping result, returning 11000 1726867167.52784: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcac9-a3a5-c734-026a-000000000090] 11000 1726867167.52848: sending task result for task 0affcac9-a3a5-c734-026a-000000000090 11000 1726867167.52918: done sending task result for task 0affcac9-a3a5-c734-026a-000000000090 11000 1726867167.52922: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 11000 1726867167.53047: no more pending results, returning what we have 11000 1726867167.53051: results queue empty 11000 1726867167.53052: checking for any_errors_fatal 11000 1726867167.53059: done checking for any_errors_fatal 11000 1726867167.53060: checking for max_fail_percentage 11000 1726867167.53061: done checking for max_fail_percentage 11000 1726867167.53062: checking to see if all hosts have failed and the running result is not ok 11000 1726867167.53063: done checking to see if all hosts have failed 11000 1726867167.53064: getting the remaining hosts for this loop 11000 1726867167.53065: done getting the remaining hosts for this loop 11000 1726867167.53068: getting the next task for host managed_node1 11000 1726867167.53074: done getting next task for host managed_node1 11000 1726867167.53079: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11000 1726867167.53083: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11000 1726867167.53096: getting variables 11000 1726867167.53097: in VariableManager get_vars() 11000 1726867167.53133: Calling all_inventory to load vars for managed_node1 11000 1726867167.53136: Calling groups_inventory to load vars for managed_node1 11000 1726867167.53281: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867167.53292: Calling all_plugins_play to load vars for managed_node1 11000 1726867167.53300: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867167.53304: Calling groups_plugins_play to load vars for managed_node1 11000 1726867167.54664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867167.56248: done with get_vars() 11000 1726867167.56267: done getting variables 11000 1726867167.56330: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 17:19:27 -0400 (0:00:00.060) 0:00:29.206 ****** 11000 1726867167.56363: entering _queue_task() for managed_node1/debug 11000 1726867167.56890: worker is 1 (out of 1 available) 11000 1726867167.56897: exiting _queue_task() for managed_node1/debug 11000 1726867167.56907: done queuing things up, now waiting for results queue to drain 11000 1726867167.56910: waiting for pending results... 11000 1726867167.57094: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11000 1726867167.57105: in run() - task 0affcac9-a3a5-c734-026a-000000000091 11000 1726867167.57127: variable 'ansible_search_path' from source: unknown 11000 1726867167.57145: variable 'ansible_search_path' from source: unknown 11000 1726867167.57187: calling self._execute() 11000 1726867167.57357: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867167.57361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867167.57364: variable 'omit' from source: magic vars 11000 1726867167.57694: variable 'ansible_distribution_major_version' from source: facts 11000 1726867167.57712: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867167.57838: variable 'network_state' from source: role '' defaults 11000 1726867167.57857: Evaluated conditional (network_state != {}): False 11000 1726867167.57865: when evaluation is False, skipping this task 11000 1726867167.57873: _execute() done 11000 1726867167.57901: dumping result to json 11000 1726867167.57905: done dumping result, returning 11000 1726867167.57930: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcac9-a3a5-c734-026a-000000000091] 11000 1726867167.57937: sending task result for task 0affcac9-a3a5-c734-026a-000000000091 skipping: [managed_node1] => { "false_condition": "network_state != {}" } 11000 1726867167.58084: no more pending results, returning what we have 11000 1726867167.58088: results queue empty 11000 1726867167.58089: checking for any_errors_fatal 11000 1726867167.58100: done checking 
for any_errors_fatal 11000 1726867167.58100: checking for max_fail_percentage 11000 1726867167.58102: done checking for max_fail_percentage 11000 1726867167.58103: checking to see if all hosts have failed and the running result is not ok 11000 1726867167.58103: done checking to see if all hosts have failed 11000 1726867167.58104: getting the remaining hosts for this loop 11000 1726867167.58105: done getting the remaining hosts for this loop 11000 1726867167.58108: getting the next task for host managed_node1 11000 1726867167.58120: done getting next task for host managed_node1 11000 1726867167.58124: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11000 1726867167.58128: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867167.58144: getting variables 11000 1726867167.58146: in VariableManager get_vars() 11000 1726867167.58187: Calling all_inventory to load vars for managed_node1 11000 1726867167.58191: Calling groups_inventory to load vars for managed_node1 11000 1726867167.58193: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867167.58201: Calling all_plugins_play to load vars for managed_node1 11000 1726867167.58203: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867167.58206: Calling groups_plugins_play to load vars for managed_node1 11000 1726867167.58786: done sending task result for task 0affcac9-a3a5-c734-026a-000000000091 11000 1726867167.58792: WORKER PROCESS EXITING 11000 1726867167.64420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867167.66113: done with get_vars() 11000 1726867167.66133: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 17:19:27 -0400 (0:00:00.098) 0:00:29.305 ****** 11000 1726867167.66228: entering _queue_task() for managed_node1/ping 11000 1726867167.66604: worker is 1 (out of 1 available) 11000 1726867167.66622: exiting _queue_task() for managed_node1/ping 11000 1726867167.66634: done queuing things up, now waiting for results queue to drain 11000 1726867167.66636: waiting for pending results... 
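
The task queued above is the role's connectivity re-check at roles/network/tasks/main.yml:192, which the log queues as the ping action for managed_node1. Its likely shape, with the when guard matching the conditional evaluated for it below:

# Likely shape of the re-test task; ping takes no required arguments
- name: Re-test connectivity
  ansible.builtin.ping:
  when: ansible_distribution_major_version != '6'
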
11000 1726867167.66842: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 11000 1726867167.66966: in run() - task 0affcac9-a3a5-c734-026a-000000000092 11000 1726867167.66982: variable 'ansible_search_path' from source: unknown 11000 1726867167.66986: variable 'ansible_search_path' from source: unknown 11000 1726867167.67045: calling self._execute() 11000 1726867167.67115: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867167.67118: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867167.67129: variable 'omit' from source: magic vars 11000 1726867167.67481: variable 'ansible_distribution_major_version' from source: facts 11000 1726867167.67493: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867167.67497: variable 'omit' from source: magic vars 11000 1726867167.67554: variable 'omit' from source: magic vars 11000 1726867167.67592: variable 'omit' from source: magic vars 11000 1726867167.67627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867167.67662: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867167.67697: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867167.67779: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867167.67784: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867167.67787: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867167.67793: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867167.67795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867167.67840: Set connection var ansible_shell_type to sh 11000 1726867167.67847: Set connection var ansible_pipelining to False 11000 1726867167.67857: Set connection var ansible_shell_executable to /bin/sh 11000 1726867167.67860: Set connection var ansible_connection to ssh 11000 1726867167.67866: Set connection var ansible_timeout to 10 11000 1726867167.67872: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867167.67913: variable 'ansible_shell_executable' from source: unknown 11000 1726867167.67917: variable 'ansible_connection' from source: unknown 11000 1726867167.68021: variable 'ansible_module_compression' from source: unknown 11000 1726867167.68024: variable 'ansible_shell_type' from source: unknown 11000 1726867167.68028: variable 'ansible_shell_executable' from source: unknown 11000 1726867167.68031: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867167.68033: variable 'ansible_pipelining' from source: unknown 11000 1726867167.68036: variable 'ansible_timeout' from source: unknown 11000 1726867167.68039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867167.68116: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11000 1726867167.68125: variable 'omit' from source: magic vars 11000 
1726867167.68129: starting attempt loop 11000 1726867167.68131: running the handler 11000 1726867167.68147: _low_level_execute_command(): starting 11000 1726867167.68264: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867167.68838: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867167.68850: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867167.68863: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867167.68880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867167.68895: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867167.69035: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867167.69041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867167.69098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867167.70775: stdout chunk (state=3): >>>/root <<< 11000 1726867167.70936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867167.70940: stdout chunk (state=3): >>><<< 11000 1726867167.70942: stderr chunk (state=3): >>><<< 11000 1726867167.71061: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867167.71065: _low_level_execute_command(): starting 11000 1726867167.71069: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867167.7096949-12324-239270715503112 `" && echo ansible-tmp-1726867167.7096949-12324-239270715503112="` echo /root/.ansible/tmp/ansible-tmp-1726867167.7096949-12324-239270715503112 `" ) && sleep 0' 11000 1726867167.71621: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867167.71682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867167.71722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867167.73604: stdout chunk (state=3): >>>ansible-tmp-1726867167.7096949-12324-239270715503112=/root/.ansible/tmp/ansible-tmp-1726867167.7096949-12324-239270715503112 <<< 11000 1726867167.73760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867167.73764: stdout chunk (state=3): >>><<< 11000 1726867167.73766: stderr chunk (state=3): >>><<< 11000 1726867167.73983: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867167.7096949-12324-239270715503112=/root/.ansible/tmp/ansible-tmp-1726867167.7096949-12324-239270715503112 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867167.73986: variable 'ansible_module_compression' from source: unknown 11000 1726867167.73992: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 11000 1726867167.73994: variable 
'ansible_facts' from source: unknown 11000 1726867167.74020: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867167.7096949-12324-239270715503112/AnsiballZ_ping.py 11000 1726867167.74201: Sending initial data 11000 1726867167.74210: Sent initial data (153 bytes) 11000 1726867167.74829: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867167.74845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867167.74861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867167.74993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867167.75014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867167.75102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867167.76654: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867167.76695: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867167.76739: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpdh_adflk /root/.ansible/tmp/ansible-tmp-1726867167.7096949-12324-239270715503112/AnsiballZ_ping.py <<< 11000 1726867167.76757: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867167.7096949-12324-239270715503112/AnsiballZ_ping.py" <<< 11000 1726867167.76801: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpdh_adflk" to remote "/root/.ansible/tmp/ansible-tmp-1726867167.7096949-12324-239270715503112/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867167.7096949-12324-239270715503112/AnsiballZ_ping.py" <<< 11000 1726867167.77693: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867167.77697: stdout chunk (state=3): >>><<< 11000 1726867167.77699: stderr chunk (state=3): >>><<< 11000 1726867167.77701: done transferring module to remote 11000 1726867167.77703: _low_level_execute_command(): starting 11000 1726867167.77705: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867167.7096949-12324-239270715503112/ /root/.ansible/tmp/ansible-tmp-1726867167.7096949-12324-239270715503112/AnsiballZ_ping.py && sleep 0' 11000 1726867167.78294: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867167.78297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867167.78363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867167.78391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867167.78432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867167.78469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867167.80291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867167.80294: stdout chunk (state=3): >>><<< 11000 1726867167.80296: stderr chunk (state=3): >>><<< 11000 1726867167.80383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867167.80390: _low_level_execute_command(): starting 11000 1726867167.80393: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867167.7096949-12324-239270715503112/AnsiballZ_ping.py && sleep 0' 11000 1726867167.80964: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867167.80981: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867167.80994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867167.81044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867167.81061: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867167.81090: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867167.81162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867167.81200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867167.81203: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867167.81306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867167.96328: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11000 1726867167.97642: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 11000 1726867167.97668: stderr chunk (state=3): >>><<< 11000 1726867167.97671: stdout chunk (state=3): >>><<< 11000 1726867167.97693: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 11000 1726867167.97713: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867167.7096949-12324-239270715503112/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867167.97722: _low_level_execute_command(): starting 11000 1726867167.97726: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867167.7096949-12324-239270715503112/ > /dev/null 2>&1 && sleep 0' 11000 1726867167.98146: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867167.98149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867167.98152: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867167.98154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867167.98207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867167.98213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867167.98257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867168.00057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867168.00081: stderr chunk (state=3): >>><<< 11000 1726867168.00085: stdout chunk (state=3): >>><<< 11000 1726867168.00101: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867168.00106: handler run complete 11000 1726867168.00117: attempt loop complete, returning result 11000 1726867168.00120: _execute() done 11000 1726867168.00122: dumping result to json 11000 1726867168.00126: done dumping result, returning 11000 1726867168.00136: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcac9-a3a5-c734-026a-000000000092] 11000 1726867168.00138: sending task result for task 0affcac9-a3a5-c734-026a-000000000092 11000 1726867168.00230: done sending task result for task 0affcac9-a3a5-c734-026a-000000000092 11000 1726867168.00233: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 11000 1726867168.00292: no more pending results, returning what we have 11000 1726867168.00295: results queue empty 11000 1726867168.00296: checking for any_errors_fatal 11000 1726867168.00305: done checking for any_errors_fatal 11000 1726867168.00306: checking for max_fail_percentage 11000 1726867168.00307: done checking for max_fail_percentage 11000 1726867168.00308: checking to see if all hosts have failed and the running result is not ok 11000 1726867168.00309: done checking to see if all hosts have failed 11000 1726867168.00309: getting the remaining hosts for this loop 11000 1726867168.00311: done getting the remaining hosts for this loop 11000 1726867168.00314: getting the next task for host managed_node1 11000 1726867168.00323: done getting next task for host managed_node1 11000 1726867168.00325: ^ task is: TASK: meta (role_complete) 11000 1726867168.00328: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, 
run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867168.00340: getting variables 11000 1726867168.00342: in VariableManager get_vars() 11000 1726867168.00382: Calling all_inventory to load vars for managed_node1 11000 1726867168.00385: Calling groups_inventory to load vars for managed_node1 11000 1726867168.00387: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867168.00397: Calling all_plugins_play to load vars for managed_node1 11000 1726867168.00400: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867168.00403: Calling groups_plugins_play to load vars for managed_node1 11000 1726867168.01189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867168.02063: done with get_vars() 11000 1726867168.02080: done getting variables 11000 1726867168.02140: done queuing things up, now waiting for results queue to drain 11000 1726867168.02141: results queue empty 11000 1726867168.02142: checking for any_errors_fatal 11000 1726867168.02143: done checking for any_errors_fatal 11000 1726867168.02144: checking for max_fail_percentage 11000 1726867168.02144: done checking for max_fail_percentage 11000 1726867168.02145: checking to see if all hosts have failed and the running result is not ok 11000 1726867168.02145: done checking to see if all hosts have failed 11000 1726867168.02146: getting the remaining hosts for this loop 11000 1726867168.02146: done getting the remaining hosts for this loop 11000 1726867168.02148: getting the next task for host managed_node1 11000 1726867168.02150: done getting next task for host managed_node1 11000 1726867168.02152: ^ task is: TASK: Delete the device '{{ controller_device }}' 11000 1726867168.02153: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11000 1726867168.02155: getting variables 11000 1726867168.02155: in VariableManager get_vars() 11000 1726867168.02165: Calling all_inventory to load vars for managed_node1 11000 1726867168.02166: Calling groups_inventory to load vars for managed_node1 11000 1726867168.02167: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867168.02170: Calling all_plugins_play to load vars for managed_node1 11000 1726867168.02171: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867168.02173: Calling groups_plugins_play to load vars for managed_node1 11000 1726867168.02855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867168.03697: done with get_vars() 11000 1726867168.03709: done getting variables 11000 1726867168.03737: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11000 1726867168.03823: variable 'controller_device' from source: play vars TASK [Delete the device 'deprecated-bond'] ************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:125 Friday 20 September 2024 17:19:28 -0400 (0:00:00.376) 0:00:29.681 ****** 11000 1726867168.03845: entering _queue_task() for managed_node1/command 11000 1726867168.04103: worker is 1 (out of 1 available) 11000 1726867168.04117: exiting _queue_task() for managed_node1/command 11000 1726867168.04133: done queuing things up, now waiting for results queue to drain 11000 1726867168.04135: waiting for pending results... 
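The task just queued is the test play's device cleanup: the banner shows the templated name "Delete the device '{{ controller_device }}'" resolving to 'deprecated-bond', and the command action plugin is loaded for it. A hedged sketch of the task, consistent with the module args traced further down (ansible.legacy.command running "ip link del deprecated-bond"):

```yaml
# Hedged sketch of tests_bond_deprecated.yml:125; the traced module args show
# the resolved command is "ip link del deprecated-bond".
- name: Delete the device '{{ controller_device }}'
  command: ip link del {{ controller_device }}
```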
11000 1726867168.04302: running TaskExecutor() for managed_node1/TASK: Delete the device 'deprecated-bond' 11000 1726867168.04379: in run() - task 0affcac9-a3a5-c734-026a-0000000000c2 11000 1726867168.04393: variable 'ansible_search_path' from source: unknown 11000 1726867168.04423: calling self._execute() 11000 1726867168.04504: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867168.04508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867168.04516: variable 'omit' from source: magic vars 11000 1726867168.04782: variable 'ansible_distribution_major_version' from source: facts 11000 1726867168.04796: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867168.04800: variable 'omit' from source: magic vars 11000 1726867168.04819: variable 'omit' from source: magic vars 11000 1726867168.04886: variable 'controller_device' from source: play vars 11000 1726867168.04902: variable 'omit' from source: magic vars 11000 1726867168.04934: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867168.04961: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867168.04978: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867168.04994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867168.05004: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867168.05029: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867168.05032: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867168.05035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867168.05101: Set connection var ansible_shell_type to sh 11000 1726867168.05107: Set connection var ansible_pipelining to False 11000 1726867168.05114: Set connection var ansible_shell_executable to /bin/sh 11000 1726867168.05117: Set connection var ansible_connection to ssh 11000 1726867168.05122: Set connection var ansible_timeout to 10 11000 1726867168.05129: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867168.05150: variable 'ansible_shell_executable' from source: unknown 11000 1726867168.05154: variable 'ansible_connection' from source: unknown 11000 1726867168.05157: variable 'ansible_module_compression' from source: unknown 11000 1726867168.05159: variable 'ansible_shell_type' from source: unknown 11000 1726867168.05162: variable 'ansible_shell_executable' from source: unknown 11000 1726867168.05164: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867168.05166: variable 'ansible_pipelining' from source: unknown 11000 1726867168.05169: variable 'ansible_timeout' from source: unknown 11000 1726867168.05171: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867168.05272: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867168.05283: variable 'omit' from source: magic vars 
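Before entering the attempt loop, the executor pins the connection settings shown above: ssh connection, sh shell, pipelining off, a 10-second timeout, and ZIP_DEFLATED module compression; ansible_host and ansible_ssh_extra_args come from host vars while the rest fall back to defaults. A hedged sketch of the host entry that would produce this (the real /tmp/network-5rw/inventory.yml is not shown in this log; the ansible_host value is inferred from the 10.31.12.57 address in the SSH traces, and the ansible_ssh_extra_args value is purely hypothetical):

```yaml
# Hedged inventory sketch; only the existence of these host vars is visible in
# the trace, not their exact values.
all:
  hosts:
    managed_node1:
      ansible_host: 10.31.12.57                                  # inferred from the SSH traces
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"  # hypothetical value
```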
11000 1726867168.05288: starting attempt loop 11000 1726867168.05294: running the handler 11000 1726867168.05306: _low_level_execute_command(): starting 11000 1726867168.05313: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867168.05813: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867168.05817: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.05821: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.05869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867168.05872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867168.05875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867168.05928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867168.07562: stdout chunk (state=3): >>>/root <<< 11000 1726867168.07664: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867168.07689: stderr chunk (state=3): >>><<< 11000 1726867168.07695: stdout chunk (state=3): >>><<< 11000 1726867168.07716: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867168.07727: _low_level_execute_command(): starting 11000 1726867168.07732: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726867168.0771549-12336-41457971483023 `" && echo ansible-tmp-1726867168.0771549-12336-41457971483023="` echo /root/.ansible/tmp/ansible-tmp-1726867168.0771549-12336-41457971483023 `" ) && sleep 0' 11000 1726867168.08149: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867168.08152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.08155: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867168.08164: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867168.08166: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.08211: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867168.08214: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867168.08264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867168.10121: stdout chunk (state=3): >>>ansible-tmp-1726867168.0771549-12336-41457971483023=/root/.ansible/tmp/ansible-tmp-1726867168.0771549-12336-41457971483023 <<< 11000 1726867168.10233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867168.10253: stderr chunk (state=3): >>><<< 11000 1726867168.10256: stdout chunk (state=3): >>><<< 11000 1726867168.10270: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867168.0771549-12336-41457971483023=/root/.ansible/tmp/ansible-tmp-1726867168.0771549-12336-41457971483023 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 
1726867168.10297: variable 'ansible_module_compression' from source: unknown 11000 1726867168.10336: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11000 1726867168.10365: variable 'ansible_facts' from source: unknown 11000 1726867168.10420: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867168.0771549-12336-41457971483023/AnsiballZ_command.py 11000 1726867168.10516: Sending initial data 11000 1726867168.10520: Sent initial data (155 bytes) 11000 1726867168.10937: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867168.10940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867168.10943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867168.10945: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867168.10947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.10999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867168.11004: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867168.11046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867168.12555: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11000 1726867168.12560: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867168.12599: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867168.12641: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpnqbn40uw /root/.ansible/tmp/ansible-tmp-1726867168.0771549-12336-41457971483023/AnsiballZ_command.py <<< 11000 1726867168.12646: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867168.0771549-12336-41457971483023/AnsiballZ_command.py" <<< 11000 1726867168.12692: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpnqbn40uw" to remote "/root/.ansible/tmp/ansible-tmp-1726867168.0771549-12336-41457971483023/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867168.0771549-12336-41457971483023/AnsiballZ_command.py" <<< 11000 1726867168.13231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867168.13262: stderr chunk (state=3): >>><<< 11000 1726867168.13266: stdout chunk (state=3): >>><<< 11000 1726867168.13304: done transferring module to remote 11000 1726867168.13312: _low_level_execute_command(): starting 11000 1726867168.13316: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867168.0771549-12336-41457971483023/ /root/.ansible/tmp/ansible-tmp-1726867168.0771549-12336-41457971483023/AnsiballZ_command.py && sleep 0' 11000 1726867168.13715: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867168.13718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.13721: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867168.13723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.13776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867168.13786: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867168.13822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867168.15533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867168.15553: stderr chunk (state=3): >>><<< 11000 1726867168.15556: stdout chunk (state=3): >>><<< 11000 1726867168.15569: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867168.15572: _low_level_execute_command(): starting 11000 1726867168.15574: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867168.0771549-12336-41457971483023/AnsiballZ_command.py && sleep 0' 11000 1726867168.15948: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867168.15978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867168.15983: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867168.15985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.15987: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867168.15989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.16041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867168.16048: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867168.16095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867168.31958: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"deprecated-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "deprecated-bond"], "start": "2024-09-20 17:19:28.310515", "end": "2024-09-20 17:19:28.317765", "delta": "0:00:00.007250", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11000 1726867168.33500: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.12.57 closed. 
<<< 11000 1726867168.33503: stdout chunk (state=3): >>><<< 11000 1726867168.33506: stderr chunk (state=3): >>><<< 11000 1726867168.33508: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"deprecated-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "deprecated-bond"], "start": "2024-09-20 17:19:28.310515", "end": "2024-09-20 17:19:28.317765", "delta": "0:00:00.007250", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.12.57 closed. 
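The raw module result above comes back with rc=1 and "Cannot find device \"deprecated-bond\"", yet the ok: display a little further down reports failed_when_result: false, so the test play evidently wraps the command in a failed_when that tolerates an already-absent device. One hedged way to express that condition (the play's actual expression is not visible in this log, and the register name is hypothetical):

```yaml
# Hedged sketch of a failed_when that matches the observed behaviour:
# rc=1 with "Cannot find device" is treated as not-failed.
- name: Delete the device '{{ controller_device }}'
  command: ip link del {{ controller_device }}
  register: del_result
  failed_when: del_result.rc != 0 and 'Cannot find device' not in del_result.stderr
```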
11000 1726867168.33512: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867168.0771549-12336-41457971483023/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867168.33515: _low_level_execute_command(): starting 11000 1726867168.33517: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867168.0771549-12336-41457971483023/ > /dev/null 2>&1 && sleep 0' 11000 1726867168.34117: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867168.34199: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867168.34202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867168.34205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867168.34208: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867168.34215: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867168.34217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.34219: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11000 1726867168.34222: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867168.34224: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11000 1726867168.34226: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867168.34228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867168.34237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867168.34246: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867168.34255: stderr chunk (state=3): >>>debug2: match found <<< 11000 1726867168.34482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867168.34486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867168.34491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867168.36324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867168.36372: stderr chunk (state=3): >>><<< 11000 1726867168.36585: stdout chunk (state=3): >>><<< 11000 1726867168.36589: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867168.36593: handler run complete 11000 1726867168.36596: Evaluated conditional (False): False 11000 1726867168.36598: Evaluated conditional (False): False 11000 1726867168.36601: attempt loop complete, returning result 11000 1726867168.36604: _execute() done 11000 1726867168.36607: dumping result to json 11000 1726867168.36609: done dumping result, returning 11000 1726867168.36612: done running TaskExecutor() for managed_node1/TASK: Delete the device 'deprecated-bond' [0affcac9-a3a5-c734-026a-0000000000c2] 11000 1726867168.36614: sending task result for task 0affcac9-a3a5-c734-026a-0000000000c2 11000 1726867168.36692: done sending task result for task 0affcac9-a3a5-c734-026a-0000000000c2 11000 1726867168.36696: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "link", "del", "deprecated-bond" ], "delta": "0:00:00.007250", "end": "2024-09-20 17:19:28.317765", "failed_when_result": false, "rc": 1, "start": "2024-09-20 17:19:28.310515" } STDERR: Cannot find device "deprecated-bond" MSG: non-zero return code 11000 1726867168.36764: no more pending results, returning what we have 11000 1726867168.36767: results queue empty 11000 1726867168.36768: checking for any_errors_fatal 11000 1726867168.36769: done checking for any_errors_fatal 11000 1726867168.36770: checking for max_fail_percentage 11000 1726867168.36882: done checking for max_fail_percentage 11000 1726867168.36885: checking to see if all hosts have failed and the running result is not ok 11000 1726867168.36887: done checking to see if all hosts have failed 11000 1726867168.36889: getting the remaining hosts for this loop 11000 1726867168.36891: done getting the remaining hosts for this loop 11000 1726867168.36894: getting the next task for host managed_node1 11000 1726867168.36900: done getting next task for host managed_node1 11000 1726867168.36903: ^ task is: TASK: Remove test interfaces 11000 1726867168.36906: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867168.36910: getting variables 11000 1726867168.36911: in VariableManager get_vars() 11000 1726867168.36944: Calling all_inventory to load vars for managed_node1 11000 1726867168.36947: Calling groups_inventory to load vars for managed_node1 11000 1726867168.36949: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867168.36958: Calling all_plugins_play to load vars for managed_node1 11000 1726867168.36960: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867168.36963: Calling groups_plugins_play to load vars for managed_node1 11000 1726867168.38415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867168.40211: done with get_vars() 11000 1726867168.40232: done getting variables 11000 1726867168.40302: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 17:19:28 -0400 (0:00:00.364) 0:00:30.046 ****** 11000 1726867168.40339: entering _queue_task() for managed_node1/shell 11000 1726867168.40856: worker is 1 (out of 1 available) 11000 1726867168.40868: exiting _queue_task() for managed_node1/shell 11000 1726867168.41084: done queuing things up, now waiting for results queue to drain 11000 1726867168.41086: waiting for pending results... 
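Note on the result above: the 'Delete the device deprecated-bond' task returns rc=1 with STDERR 'Cannot find device "deprecated-bond"' but is still reported ok because failed_when_result is false, i.e. an already-absent device counts as cleaned up. A minimal shell sketch of the same tolerant-delete pattern, using only the device name seen in this run (an illustration, not part of the captured test playbook):

    #!/bin/sh
    # Illustrative only: delete the link if it exists; absence is not treated as an error,
    # mirroring the failed_when_result=false handling in the task result above.
    dev=deprecated-bond
    if ip link show "$dev" > /dev/null 2>&1; then
        ip link del "$dev"
    else
        echo "device $dev not present, nothing to delete" >&2
    fi
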
11000 1726867168.41215: running TaskExecutor() for managed_node1/TASK: Remove test interfaces 11000 1726867168.41311: in run() - task 0affcac9-a3a5-c734-026a-0000000000c6 11000 1726867168.41387: variable 'ansible_search_path' from source: unknown 11000 1726867168.41391: variable 'ansible_search_path' from source: unknown 11000 1726867168.41393: calling self._execute() 11000 1726867168.41476: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867168.41492: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867168.41507: variable 'omit' from source: magic vars 11000 1726867168.41991: variable 'ansible_distribution_major_version' from source: facts 11000 1726867168.42008: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867168.42019: variable 'omit' from source: magic vars 11000 1726867168.42187: variable 'omit' from source: magic vars 11000 1726867168.42299: variable 'dhcp_interface1' from source: play vars 11000 1726867168.42309: variable 'dhcp_interface2' from source: play vars 11000 1726867168.42332: variable 'omit' from source: magic vars 11000 1726867168.42373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867168.42416: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867168.42441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867168.42463: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867168.42480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867168.42517: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867168.42525: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867168.42532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867168.42631: Set connection var ansible_shell_type to sh 11000 1726867168.42643: Set connection var ansible_pipelining to False 11000 1726867168.42654: Set connection var ansible_shell_executable to /bin/sh 11000 1726867168.42782: Set connection var ansible_connection to ssh 11000 1726867168.42785: Set connection var ansible_timeout to 10 11000 1726867168.42787: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867168.42789: variable 'ansible_shell_executable' from source: unknown 11000 1726867168.42791: variable 'ansible_connection' from source: unknown 11000 1726867168.42793: variable 'ansible_module_compression' from source: unknown 11000 1726867168.42795: variable 'ansible_shell_type' from source: unknown 11000 1726867168.42797: variable 'ansible_shell_executable' from source: unknown 11000 1726867168.42799: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867168.42801: variable 'ansible_pipelining' from source: unknown 11000 1726867168.42803: variable 'ansible_timeout' from source: unknown 11000 1726867168.42805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867168.42884: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867168.42901: variable 'omit' from source: magic vars 11000 1726867168.42911: starting attempt loop 11000 1726867168.42921: running the handler 11000 1726867168.42936: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867168.42960: _low_level_execute_command(): starting 11000 1726867168.42973: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867168.43700: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.43742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867168.43766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867168.43850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867168.45497: stdout chunk (state=3): >>>/root <<< 11000 1726867168.45613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867168.45616: stderr chunk (state=3): >>><<< 11000 1726867168.45622: stdout chunk (state=3): >>><<< 11000 1726867168.45679: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867168.45683: _low_level_execute_command(): starting 11000 1726867168.45687: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867168.4564457-12350-210672170780722 `" && echo ansible-tmp-1726867168.4564457-12350-210672170780722="` echo /root/.ansible/tmp/ansible-tmp-1726867168.4564457-12350-210672170780722 `" ) && sleep 0' 11000 1726867168.46244: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867168.46254: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867168.46265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867168.46441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867168.46445: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867168.46460: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.46465: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867168.46468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867168.46471: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867168.46504: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867168.48418: stdout chunk (state=3): >>>ansible-tmp-1726867168.4564457-12350-210672170780722=/root/.ansible/tmp/ansible-tmp-1726867168.4564457-12350-210672170780722 <<< 11000 1726867168.48483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867168.48724: stderr chunk (state=3): >>><<< 11000 1726867168.48728: stdout chunk (state=3): >>><<< 11000 1726867168.48731: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867168.4564457-12350-210672170780722=/root/.ansible/tmp/ansible-tmp-1726867168.4564457-12350-210672170780722 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867168.48733: variable 'ansible_module_compression' from source: unknown 11000 1726867168.48735: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11000 1726867168.48916: variable 'ansible_facts' from source: unknown 11000 1726867168.49181: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867168.4564457-12350-210672170780722/AnsiballZ_command.py 11000 1726867168.49405: Sending initial data 11000 1726867168.49415: Sent initial data (156 bytes) 11000 1726867168.49996: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867168.50050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.50130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867168.50213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867168.51913: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867168.51953: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867168.52000: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpdhddhiqz /root/.ansible/tmp/ansible-tmp-1726867168.4564457-12350-210672170780722/AnsiballZ_command.py <<< 11000 1726867168.52007: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867168.4564457-12350-210672170780722/AnsiballZ_command.py" <<< 11000 1726867168.52053: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpdhddhiqz" to remote "/root/.ansible/tmp/ansible-tmp-1726867168.4564457-12350-210672170780722/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867168.4564457-12350-210672170780722/AnsiballZ_command.py" <<< 11000 1726867168.52747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867168.52983: stderr chunk (state=3): >>><<< 11000 1726867168.52986: stdout chunk (state=3): >>><<< 11000 1726867168.52991: done transferring module to remote 11000 1726867168.52994: _low_level_execute_command(): starting 11000 1726867168.52996: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867168.4564457-12350-210672170780722/ /root/.ansible/tmp/ansible-tmp-1726867168.4564457-12350-210672170780722/AnsiballZ_command.py && sleep 0' 11000 1726867168.53358: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867168.53367: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867168.53380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867168.53393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867168.53406: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867168.53413: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867168.53422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.53439: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11000 1726867168.53442: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867168.53449: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11000 1726867168.53496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.53543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867168.53553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867168.53581: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867168.53643: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867168.55542: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867168.55546: stdout chunk (state=3): >>><<< 11000 1726867168.55548: stderr 
chunk (state=3): >>><<< 11000 1726867168.55562: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867168.55640: _low_level_execute_command(): starting 11000 1726867168.55644: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867168.4564457-12350-210672170780722/AnsiballZ_command.py && sleep 0' 11000 1726867168.56142: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867168.56156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867168.56170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867168.56191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867168.56293: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867168.56315: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867168.56398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867168.75587: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not 
delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 17:19:28.712746", "end": "2024-09-20 17:19:28.753977", "delta": "0:00:00.041231", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11000 1726867168.77231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867168.77242: stdout chunk (state=3): >>><<< 11000 1726867168.77256: stderr chunk (state=3): >>><<< 11000 1726867168.77281: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 17:19:28.712746", "end": "2024-09-20 17:19:28.753977", "delta": "0:00:00.041231", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 11000 1726867168.77612: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867168.4564457-12350-210672170780722/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867168.77616: _low_level_execute_command(): starting 11000 1726867168.77619: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867168.4564457-12350-210672170780722/ > /dev/null 2>&1 && sleep 0' 11000 1726867168.78698: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867168.78718: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867168.78733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867168.78823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.78897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867168.78902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867168.78938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867168.80908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867168.80911: stdout chunk (state=3): >>><<< 11000 1726867168.80916: stderr chunk (state=3): >>><<< 11000 1726867168.80919: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867168.80922: handler run complete 11000 1726867168.80924: Evaluated conditional (False): False 11000 1726867168.80926: attempt loop complete, returning result 11000 1726867168.80928: _execute() done 11000 1726867168.80930: dumping result to json 11000 1726867168.80932: done dumping result, returning 11000 1726867168.80939: done running TaskExecutor() for managed_node1/TASK: Remove test interfaces [0affcac9-a3a5-c734-026a-0000000000c6] 11000 1726867168.80944: sending task result for task 0affcac9-a3a5-c734-026a-0000000000c6 ok: [managed_node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.041231", "end": "2024-09-20 17:19:28.753977", "rc": 0, "start": "2024-09-20 17:19:28.712746" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 11000 1726867168.81257: no more pending results, returning what we have 11000 1726867168.81260: results queue empty 11000 1726867168.81261: checking for any_errors_fatal 11000 1726867168.81274: done checking for any_errors_fatal 11000 1726867168.81275: checking for max_fail_percentage 11000 1726867168.81279: done checking for max_fail_percentage 11000 1726867168.81280: checking to see if all hosts have failed and the running result is not ok 11000 1726867168.81281: done checking to see if all hosts have failed 11000 1726867168.81282: getting the remaining hosts for this loop 11000 1726867168.81283: done getting the remaining hosts for this loop 11000 1726867168.81287: getting the next task for host managed_node1 11000 1726867168.81297: done getting next task for host managed_node1 11000 1726867168.81300: ^ task is: TASK: Stop dnsmasq/radvd services 11000 1726867168.81305: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867168.81310: getting variables 11000 1726867168.81312: in VariableManager get_vars() 11000 1726867168.81355: Calling all_inventory to load vars for managed_node1 11000 1726867168.81359: Calling groups_inventory to load vars for managed_node1 11000 1726867168.81362: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867168.81374: Calling all_plugins_play to load vars for managed_node1 11000 1726867168.81419: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867168.81425: done sending task result for task 0affcac9-a3a5-c734-026a-0000000000c6 11000 1726867168.81428: WORKER PROCESS EXITING 11000 1726867168.81438: Calling groups_plugins_play to load vars for managed_node1 11000 1726867168.83237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867168.84693: done with get_vars() 11000 1726867168.84708: done getting variables 11000 1726867168.84749: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 17:19:28 -0400 (0:00:00.444) 0:00:30.490 ****** 11000 1726867168.84773: entering _queue_task() for managed_node1/shell 11000 1726867168.85003: worker is 1 (out of 1 available) 11000 1726867168.85017: exiting _queue_task() for managed_node1/shell 11000 1726867168.85030: done queuing things up, now waiting for results queue to drain 11000 1726867168.85031: waiting for pending results... 
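For readability, the '_raw_params' command string from the 'Remove test interfaces' result above (task file remove_test_interfaces_with_dhcp.yml in this run) is the following script, shown here with the \n escapes expanded and whitespace lightly normalized; it is a rendering of the captured module args, not new test code:

    set -euxo pipefail
    exec 1>&2
    rc=0
    ip link delete test1 || rc="$?"
    if [ "$rc" != 0 ]; then
        echo ERROR - could not delete link test1 - error "$rc"
    fi
    ip link delete test2 || rc="$?"
    if [ "$rc" != 0 ]; then
        echo ERROR - could not delete link test2 - error "$rc"
    fi
    ip link delete testbr || rc="$?"
    if [ "$rc" != 0 ]; then
        echo ERROR - could not delete link testbr - error "$rc"
    fi
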
11000 1726867168.85287: running TaskExecutor() for managed_node1/TASK: Stop dnsmasq/radvd services 11000 1726867168.85424: in run() - task 0affcac9-a3a5-c734-026a-0000000000c7 11000 1726867168.85517: variable 'ansible_search_path' from source: unknown 11000 1726867168.85520: variable 'ansible_search_path' from source: unknown 11000 1726867168.85523: calling self._execute() 11000 1726867168.85596: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867168.85608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867168.85625: variable 'omit' from source: magic vars 11000 1726867168.86048: variable 'ansible_distribution_major_version' from source: facts 11000 1726867168.86062: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867168.86070: variable 'omit' from source: magic vars 11000 1726867168.86119: variable 'omit' from source: magic vars 11000 1726867168.86208: variable 'omit' from source: magic vars 11000 1726867168.86212: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867168.86238: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867168.86268: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867168.86272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867168.86300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867168.86311: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867168.86315: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867168.86419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867168.86423: Set connection var ansible_shell_type to sh 11000 1726867168.86425: Set connection var ansible_pipelining to False 11000 1726867168.86428: Set connection var ansible_shell_executable to /bin/sh 11000 1726867168.86430: Set connection var ansible_connection to ssh 11000 1726867168.86442: Set connection var ansible_timeout to 10 11000 1726867168.86445: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867168.86682: variable 'ansible_shell_executable' from source: unknown 11000 1726867168.86686: variable 'ansible_connection' from source: unknown 11000 1726867168.86691: variable 'ansible_module_compression' from source: unknown 11000 1726867168.86693: variable 'ansible_shell_type' from source: unknown 11000 1726867168.86696: variable 'ansible_shell_executable' from source: unknown 11000 1726867168.86698: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867168.86700: variable 'ansible_pipelining' from source: unknown 11000 1726867168.86702: variable 'ansible_timeout' from source: unknown 11000 1726867168.86704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867168.86707: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867168.86709: variable 'omit' from source: magic vars 11000 
1726867168.86711: starting attempt loop 11000 1726867168.86713: running the handler 11000 1726867168.86716: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867168.86718: _low_level_execute_command(): starting 11000 1726867168.86721: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867168.87303: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867168.87352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.87392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867168.87406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867168.87450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867168.89080: stdout chunk (state=3): >>>/root <<< 11000 1726867168.89181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867168.89205: stderr chunk (state=3): >>><<< 11000 1726867168.89209: stdout chunk (state=3): >>><<< 11000 1726867168.89231: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867168.89242: _low_level_execute_command(): 
starting 11000 1726867168.89247: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867168.8922973-12384-235282631624450 `" && echo ansible-tmp-1726867168.8922973-12384-235282631624450="` echo /root/.ansible/tmp/ansible-tmp-1726867168.8922973-12384-235282631624450 `" ) && sleep 0' 11000 1726867168.89794: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.89829: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867168.89844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867168.89857: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867168.89954: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867168.91807: stdout chunk (state=3): >>>ansible-tmp-1726867168.8922973-12384-235282631624450=/root/.ansible/tmp/ansible-tmp-1726867168.8922973-12384-235282631624450 <<< 11000 1726867168.91912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867168.91937: stderr chunk (state=3): >>><<< 11000 1726867168.91939: stdout chunk (state=3): >>><<< 11000 1726867168.91970: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867168.8922973-12384-235282631624450=/root/.ansible/tmp/ansible-tmp-1726867168.8922973-12384-235282631624450 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 
1726867168.91981: variable 'ansible_module_compression' from source: unknown 11000 1726867168.92043: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11000 1726867168.92085: variable 'ansible_facts' from source: unknown 11000 1726867168.92176: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867168.8922973-12384-235282631624450/AnsiballZ_command.py 11000 1726867168.92389: Sending initial data 11000 1726867168.92393: Sent initial data (156 bytes) 11000 1726867168.92821: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867168.92828: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867168.92868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.92918: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867168.92953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867168.93031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867168.94540: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11000 1726867168.94549: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867168.94582: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867168.94625: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmp141605lg /root/.ansible/tmp/ansible-tmp-1726867168.8922973-12384-235282631624450/AnsiballZ_command.py <<< 11000 1726867168.94631: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867168.8922973-12384-235282631624450/AnsiballZ_command.py" <<< 11000 1726867168.94670: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmp141605lg" to remote "/root/.ansible/tmp/ansible-tmp-1726867168.8922973-12384-235282631624450/AnsiballZ_command.py" <<< 11000 1726867168.94676: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867168.8922973-12384-235282631624450/AnsiballZ_command.py" <<< 11000 1726867168.95216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867168.95248: stderr chunk (state=3): >>><<< 11000 1726867168.95251: stdout chunk (state=3): >>><<< 11000 1726867168.95289: done transferring module to remote 11000 1726867168.95309: _low_level_execute_command(): starting 11000 1726867168.95312: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867168.8922973-12384-235282631624450/ /root/.ansible/tmp/ansible-tmp-1726867168.8922973-12384-235282631624450/AnsiballZ_command.py && sleep 0' 11000 1726867168.95906: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867168.95909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found <<< 11000 1726867168.95912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867168.95914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867168.95980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.96021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867168.96079: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867168.96152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867168.97898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867168.97920: stderr chunk (state=3): >>><<< 11000 1726867168.97923: stdout chunk (state=3): >>><<< 11000 1726867168.97936: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867168.97939: _low_level_execute_command(): starting 11000 1726867168.97947: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867168.8922973-12384-235282631624450/AnsiballZ_command.py && sleep 0' 11000 1726867168.98339: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867168.98342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.98344: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration <<< 11000 1726867168.98346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867168.98348: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867168.98405: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867168.98408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867168.98453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867169.16487: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n 
fi\n done\nfi\n", "start": "2024-09-20 17:19:29.133410", "end": "2024-09-20 17:19:29.159800", "delta": "0:00:00.026390", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11000 1726867169.17717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867169.17721: stderr chunk (state=3): >>>Shared connection to 10.31.12.57 closed. <<< 11000 1726867169.17768: stderr chunk (state=3): >>><<< 11000 1726867169.17772: stdout chunk (state=3): >>><<< 11000 1726867169.17806: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 17:19:29.133410", "end": "2024-09-20 17:19:29.159800", "delta": "0:00:00.026390", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 11000 1726867169.17848: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867168.8922973-12384-235282631624450/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867169.17983: _low_level_execute_command(): starting 11000 1726867169.17987: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867168.8922973-12384-235282631624450/ > /dev/null 2>&1 && sleep 0' 11000 1726867169.18488: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867169.18501: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867169.18525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867169.18591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867169.18634: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867169.18644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867169.18663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867169.18728: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867169.20569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867169.20573: stdout chunk (state=3): >>><<< 11000 1726867169.20580: stderr chunk (state=3): >>><<< 11000 1726867169.20782: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867169.20785: handler run complete 11000 1726867169.20787: Evaluated conditional (False): False 11000 1726867169.20789: attempt loop complete, returning result 11000 1726867169.20791: _execute() done 11000 1726867169.20792: dumping result to json 11000 1726867169.20794: done dumping result, returning 11000 1726867169.20795: done running TaskExecutor() for managed_node1/TASK: Stop dnsmasq/radvd services [0affcac9-a3a5-c734-026a-0000000000c7] 11000 1726867169.20797: sending task result for task 0affcac9-a3a5-c734-026a-0000000000c7 11000 1726867169.20863: done sending task result for task 0affcac9-a3a5-c734-026a-0000000000c7 11000 1726867169.20867: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.026390", "end": "2024-09-20 17:19:29.159800", "rc": 0, "start": "2024-09-20 17:19:29.133410" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 11000 1726867169.21042: no more pending results, returning what we have 11000 1726867169.21045: results queue empty 11000 1726867169.21046: checking for any_errors_fatal 11000 1726867169.21056: done checking for any_errors_fatal 11000 1726867169.21057: checking for max_fail_percentage 11000 1726867169.21058: done checking for max_fail_percentage 11000 1726867169.21059: checking to see if all hosts have failed and the running result is not ok 11000 1726867169.21060: done checking to see 
if all hosts have failed 11000 1726867169.21061: getting the remaining hosts for this loop 11000 1726867169.21063: done getting the remaining hosts for this loop 11000 1726867169.21066: getting the next task for host managed_node1 11000 1726867169.21075: done getting next task for host managed_node1 11000 1726867169.21243: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 11000 1726867169.21247: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11000 1726867169.21252: getting variables 11000 1726867169.21253: in VariableManager get_vars() 11000 1726867169.21297: Calling all_inventory to load vars for managed_node1 11000 1726867169.21300: Calling groups_inventory to load vars for managed_node1 11000 1726867169.21302: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867169.21312: Calling all_plugins_play to load vars for managed_node1 11000 1726867169.21315: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867169.21323: Calling groups_plugins_play to load vars for managed_node1 11000 1726867169.23491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867169.25102: done with get_vars() 11000 1726867169.25123: done getting variables 11000 1726867169.25183: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:131 Friday 20 September 2024 17:19:29 -0400 (0:00:00.404) 0:00:30.895 ****** 11000 1726867169.25213: entering _queue_task() for managed_node1/command 11000 1726867169.25609: worker is 1 (out of 1 available) 11000 1726867169.25620: exiting _queue_task() for managed_node1/command 11000 1726867169.25632: done queuing things up, now waiting for results queue to drain 11000 1726867169.25635: waiting for pending results... 
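The next stretch of the trace queues the "Restore the /etc/resolv.conf for initscript" task from tests_bond_deprecated.yml:131 and then skips it: the worker logs that ansible_distribution_major_version != '6' evaluates True but network_provider == "initscripts" evaluates False, so the handler never runs. For reference, a guarded command task of that shape would look roughly like the sketch below. This is a reconstruction from the conditional evaluations in the trace, not the playbook source; because the task is skipped, its real module arguments never appear in this log, so the command shown is only a placeholder.

    # Sketch only: shape inferred from the skip decision logged below,
    # not copied from tests_bond_deprecated.yml.
    - name: Restore the /etc/resolv.conf for initscript
      command: echo "restore /etc/resolv.conf here"   # hypothetical placeholder; real command not visible in this trace
      when:
        - ansible_distribution_major_version != '6'
        - network_provider == "initscripts"
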
11000 1726867169.25818: running TaskExecutor() for managed_node1/TASK: Restore the /etc/resolv.conf for initscript 11000 1726867169.25929: in run() - task 0affcac9-a3a5-c734-026a-0000000000c8 11000 1726867169.25971: variable 'ansible_search_path' from source: unknown 11000 1726867169.26004: calling self._execute() 11000 1726867169.26284: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867169.26288: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867169.26291: variable 'omit' from source: magic vars 11000 1726867169.26691: variable 'ansible_distribution_major_version' from source: facts 11000 1726867169.26797: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867169.27036: variable 'network_provider' from source: set_fact 11000 1726867169.27075: Evaluated conditional (network_provider == "initscripts"): False 11000 1726867169.27085: when evaluation is False, skipping this task 11000 1726867169.27385: _execute() done 11000 1726867169.27389: dumping result to json 11000 1726867169.27391: done dumping result, returning 11000 1726867169.27394: done running TaskExecutor() for managed_node1/TASK: Restore the /etc/resolv.conf for initscript [0affcac9-a3a5-c734-026a-0000000000c8] 11000 1726867169.27397: sending task result for task 0affcac9-a3a5-c734-026a-0000000000c8 11000 1726867169.27474: done sending task result for task 0affcac9-a3a5-c734-026a-0000000000c8 11000 1726867169.27479: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11000 1726867169.27535: no more pending results, returning what we have 11000 1726867169.27538: results queue empty 11000 1726867169.27539: checking for any_errors_fatal 11000 1726867169.27548: done checking for any_errors_fatal 11000 1726867169.27548: checking for max_fail_percentage 11000 1726867169.27551: done checking for max_fail_percentage 11000 1726867169.27552: checking to see if all hosts have failed and the running result is not ok 11000 1726867169.27553: done checking to see if all hosts have failed 11000 1726867169.27553: getting the remaining hosts for this loop 11000 1726867169.27555: done getting the remaining hosts for this loop 11000 1726867169.27558: getting the next task for host managed_node1 11000 1726867169.27566: done getting next task for host managed_node1 11000 1726867169.27569: ^ task is: TASK: Verify network state restored to default 11000 1726867169.27573: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11000 1726867169.27580: getting variables 11000 1726867169.27582: in VariableManager get_vars() 11000 1726867169.27621: Calling all_inventory to load vars for managed_node1 11000 1726867169.27624: Calling groups_inventory to load vars for managed_node1 11000 1726867169.27626: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867169.27640: Calling all_plugins_play to load vars for managed_node1 11000 1726867169.27643: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867169.27645: Calling groups_plugins_play to load vars for managed_node1 11000 1726867169.29511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867169.31245: done with get_vars() 11000 1726867169.31264: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:136 Friday 20 September 2024 17:19:29 -0400 (0:00:00.061) 0:00:30.956 ****** 11000 1726867169.31355: entering _queue_task() for managed_node1/include_tasks 11000 1726867169.31632: worker is 1 (out of 1 available) 11000 1726867169.31645: exiting _queue_task() for managed_node1/include_tasks 11000 1726867169.31657: done queuing things up, now waiting for results queue to drain 11000 1726867169.31658: waiting for pending results... 11000 1726867169.31934: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 11000 1726867169.32057: in run() - task 0affcac9-a3a5-c734-026a-0000000000c9 11000 1726867169.32076: variable 'ansible_search_path' from source: unknown 11000 1726867169.32124: calling self._execute() 11000 1726867169.32263: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867169.32275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867169.32294: variable 'omit' from source: magic vars 11000 1726867169.32687: variable 'ansible_distribution_major_version' from source: facts 11000 1726867169.32708: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867169.32719: _execute() done 11000 1726867169.32728: dumping result to json 11000 1726867169.32736: done dumping result, returning 11000 1726867169.32757: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [0affcac9-a3a5-c734-026a-0000000000c9] 11000 1726867169.32760: sending task result for task 0affcac9-a3a5-c734-026a-0000000000c9 11000 1726867169.32952: done sending task result for task 0affcac9-a3a5-c734-026a-0000000000c9 11000 1726867169.32955: WORKER PROCESS EXITING 11000 1726867169.32984: no more pending results, returning what we have 11000 1726867169.32991: in VariableManager get_vars() 11000 1726867169.33038: Calling all_inventory to load vars for managed_node1 11000 1726867169.33041: Calling groups_inventory to load vars for managed_node1 11000 1726867169.33044: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867169.33058: Calling all_plugins_play to load vars for managed_node1 11000 1726867169.33061: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867169.33064: Calling groups_plugins_play to load vars for managed_node1 11000 1726867169.34683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867169.36517: done with get_vars() 11000 
1726867169.36534: variable 'ansible_search_path' from source: unknown 11000 1726867169.36547: we have included files to process 11000 1726867169.36548: generating all_blocks data 11000 1726867169.36551: done generating all_blocks data 11000 1726867169.36556: processing included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11000 1726867169.36557: loading included file: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11000 1726867169.36560: Loading data from /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11000 1726867169.36959: done processing included file 11000 1726867169.36962: iterating over new_blocks loaded from include file 11000 1726867169.36963: in VariableManager get_vars() 11000 1726867169.36984: done with get_vars() 11000 1726867169.36985: filtering new block on tags 11000 1726867169.37022: done filtering new block on tags 11000 1726867169.37024: done iterating over new_blocks loaded from include file included: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 11000 1726867169.37029: extending task lists for all hosts with included blocks 11000 1726867169.38280: done extending task lists 11000 1726867169.38282: done processing included files 11000 1726867169.38283: results queue empty 11000 1726867169.38283: checking for any_errors_fatal 11000 1726867169.38286: done checking for any_errors_fatal 11000 1726867169.38287: checking for max_fail_percentage 11000 1726867169.38290: done checking for max_fail_percentage 11000 1726867169.38291: checking to see if all hosts have failed and the running result is not ok 11000 1726867169.38292: done checking to see if all hosts have failed 11000 1726867169.38293: getting the remaining hosts for this loop 11000 1726867169.38294: done getting the remaining hosts for this loop 11000 1726867169.38296: getting the next task for host managed_node1 11000 1726867169.38301: done getting next task for host managed_node1 11000 1726867169.38303: ^ task is: TASK: Check routes and DNS 11000 1726867169.38305: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11000 1726867169.38307: getting variables 11000 1726867169.38308: in VariableManager get_vars() 11000 1726867169.38321: Calling all_inventory to load vars for managed_node1 11000 1726867169.38323: Calling groups_inventory to load vars for managed_node1 11000 1726867169.38325: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867169.38331: Calling all_plugins_play to load vars for managed_node1 11000 1726867169.38333: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867169.38336: Calling groups_plugins_play to load vars for managed_node1 11000 1726867169.39491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867169.41102: done with get_vars() 11000 1726867169.41123: done getting variables 11000 1726867169.41163: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 17:19:29 -0400 (0:00:00.098) 0:00:31.055 ****** 11000 1726867169.41196: entering _queue_task() for managed_node1/shell 11000 1726867169.41624: worker is 1 (out of 1 available) 11000 1726867169.41636: exiting _queue_task() for managed_node1/shell 11000 1726867169.41647: done queuing things up, now waiting for results queue to drain 11000 1726867169.41648: waiting for pending results... 11000 1726867169.41891: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 11000 1726867169.41985: in run() - task 0affcac9-a3a5-c734-026a-000000000570 11000 1726867169.42184: variable 'ansible_search_path' from source: unknown 11000 1726867169.42191: variable 'ansible_search_path' from source: unknown 11000 1726867169.42194: calling self._execute() 11000 1726867169.42196: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867169.42199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867169.42201: variable 'omit' from source: magic vars 11000 1726867169.42525: variable 'ansible_distribution_major_version' from source: facts 11000 1726867169.42544: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867169.42553: variable 'omit' from source: magic vars 11000 1726867169.42601: variable 'omit' from source: magic vars 11000 1726867169.42640: variable 'omit' from source: magic vars 11000 1726867169.42679: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867169.42719: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867169.42742: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867169.42984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867169.42988: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867169.42993: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
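Just below, the worker resolves the connection settings for managed_node1 and logs a series of "Set connection var" entries (sh shell type, /bin/sh shell executable, pipelining off, ssh connection, 10 second timeout). As a rough illustration, and assuming these came from explicit host variables rather than defaults, they could be written in a YAML inventory like the stanza below; the values simply mirror the trace (10.31.12.57 is the address the ssh connection in this log targets), and this is not the actual inventory file used by this run.

    # Illustrative inventory stanza, assuming explicit host vars;
    # values mirror the "Set connection var" lines that follow in the trace.
    all:
      hosts:
        managed_node1:
          ansible_host: 10.31.12.57
          ansible_connection: ssh
          ansible_shell_type: sh
          ansible_shell_executable: /bin/sh
          ansible_pipelining: false
          ansible_timeout: 10
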
11000 1726867169.42996: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867169.42999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867169.43002: Set connection var ansible_shell_type to sh 11000 1726867169.43004: Set connection var ansible_pipelining to False 11000 1726867169.43006: Set connection var ansible_shell_executable to /bin/sh 11000 1726867169.43009: Set connection var ansible_connection to ssh 11000 1726867169.43011: Set connection var ansible_timeout to 10 11000 1726867169.43012: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867169.43014: variable 'ansible_shell_executable' from source: unknown 11000 1726867169.43017: variable 'ansible_connection' from source: unknown 11000 1726867169.43019: variable 'ansible_module_compression' from source: unknown 11000 1726867169.43021: variable 'ansible_shell_type' from source: unknown 11000 1726867169.43023: variable 'ansible_shell_executable' from source: unknown 11000 1726867169.43025: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867169.43027: variable 'ansible_pipelining' from source: unknown 11000 1726867169.43030: variable 'ansible_timeout' from source: unknown 11000 1726867169.43032: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867169.43171: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867169.43194: variable 'omit' from source: magic vars 11000 1726867169.43205: starting attempt loop 11000 1726867169.43212: running the handler 11000 1726867169.43228: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867169.43255: _low_level_execute_command(): starting 11000 1726867169.43269: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867169.44005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867169.44105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867169.44145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867169.44163: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 11000 1726867169.44191: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867169.44328: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867169.45999: stdout chunk (state=3): >>>/root <<< 11000 1726867169.46137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867169.46142: stderr chunk (state=3): >>><<< 11000 1726867169.46145: stdout chunk (state=3): >>><<< 11000 1726867169.46384: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867169.46387: _low_level_execute_command(): starting 11000 1726867169.46394: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867169.4616857-12415-263364116107123 `" && echo ansible-tmp-1726867169.4616857-12415-263364116107123="` echo /root/.ansible/tmp/ansible-tmp-1726867169.4616857-12415-263364116107123 `" ) && sleep 0' 11000 1726867169.46760: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867169.46770: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867169.46788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867169.46803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867169.46816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867169.46823: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867169.46833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867169.46850: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11000 1726867169.46857: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867169.46864: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11000 1726867169.46872: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867169.46885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867169.46899: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867169.46904: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867169.46912: stderr chunk (state=3): >>>debug2: match found <<< 11000 1726867169.46922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867169.46987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867169.46999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867169.47017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867169.47106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867169.49004: stdout chunk (state=3): >>>ansible-tmp-1726867169.4616857-12415-263364116107123=/root/.ansible/tmp/ansible-tmp-1726867169.4616857-12415-263364116107123 <<< 11000 1726867169.49125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867169.49142: stderr chunk (state=3): >>><<< 11000 1726867169.49283: stdout chunk (state=3): >>><<< 11000 1726867169.49287: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867169.4616857-12415-263364116107123=/root/.ansible/tmp/ansible-tmp-1726867169.4616857-12415-263364116107123 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867169.49293: variable 'ansible_module_compression' from source: unknown 11000 1726867169.49296: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11000 1726867169.49303: variable 'ansible_facts' from source: unknown 11000 1726867169.49398: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867169.4616857-12415-263364116107123/AnsiballZ_command.py 11000 1726867169.49616: Sending initial data 11000 1726867169.49629: Sent initial data (156 bytes) 11000 1726867169.50139: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867169.50154: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867169.50170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867169.50275: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867169.50313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867169.50382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867169.51921: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867169.52016: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867169.52019: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867169.4616857-12415-263364116107123/AnsiballZ_command.py" <<< 11000 1726867169.52022: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmpefzrl6ab /root/.ansible/tmp/ansible-tmp-1726867169.4616857-12415-263364116107123/AnsiballZ_command.py <<< 11000 1726867169.52128: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmpefzrl6ab" to remote "/root/.ansible/tmp/ansible-tmp-1726867169.4616857-12415-263364116107123/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867169.4616857-12415-263364116107123/AnsiballZ_command.py" <<< 11000 1726867169.52854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867169.52894: stderr chunk (state=3): >>><<< 11000 1726867169.52897: stdout chunk (state=3): >>><<< 11000 1726867169.52908: done transferring module to remote 11000 1726867169.52916: _low_level_execute_command(): starting 11000 1726867169.52921: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867169.4616857-12415-263364116107123/ /root/.ansible/tmp/ansible-tmp-1726867169.4616857-12415-263364116107123/AnsiballZ_command.py && sleep 0' 11000 1726867169.53326: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867169.53333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867169.53336: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found <<< 11000 1726867169.53339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867169.53375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867169.53384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867169.53439: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867169.55206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867169.55209: stderr chunk (state=3): >>><<< 11000 1726867169.55212: stdout chunk (state=3): >>><<< 11000 1726867169.55244: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867169.55247: _low_level_execute_command(): starting 11000 1726867169.55250: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867169.4616857-12415-263364116107123/AnsiballZ_command.py && sleep 0' 11000 1726867169.55803: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867169.55817: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867169.55900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867169.55937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867169.55951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867169.55972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867169.56057: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867169.72096: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:fe:d3:7d:4f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.12.57/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3201sec preferred_lft 3201sec\n inet6 fe80::8ff:feff:fed3:7d4f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.57 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.57 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref 
medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 17:19:29.709936", "end": "2024-09-20 17:19:29.718426", "delta": "0:00:00.008490", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11000 1726867169.73709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. <<< 11000 1726867169.73714: stdout chunk (state=3): >>><<< 11000 1726867169.73716: stderr chunk (state=3): >>><<< 11000 1726867169.74084: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:fe:d3:7d:4f brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.12.57/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3201sec preferred_lft 3201sec\n inet6 fe80::8ff:feff:fed3:7d4f/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.57 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.57 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 17:19:29.709936", "end": "2024-09-20 17:19:29.718426", "delta": "0:00:00.008490", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 11000 1726867169.74097: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867169.4616857-12415-263364116107123/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867169.74100: _low_level_execute_command(): starting 11000 1726867169.74103: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867169.4616857-12415-263364116107123/ > /dev/null 2>&1 && sleep 0' 11000 1726867169.74871: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867169.74882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867169.74894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867169.74906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867169.74918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867169.74925: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867169.74934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867169.74948: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11000 1726867169.74955: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867169.74961: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11000 1726867169.74969: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867169.74980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867169.75055: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867169.75071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867169.75294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867169.77086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867169.77092: stdout chunk (state=3): >>><<< 11000 1726867169.77095: stderr chunk (state=3): >>><<< 11000 1726867169.77097: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867169.77100: handler run complete 11000 1726867169.77102: Evaluated conditional (False): False 11000 1726867169.77104: attempt loop complete, returning result 11000 1726867169.77105: _execute() done 11000 1726867169.77107: dumping result to json 11000 1726867169.77108: done dumping result, returning 11000 1726867169.77110: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [0affcac9-a3a5-c734-026a-000000000570] 11000 1726867169.77111: sending task result for task 0affcac9-a3a5-c734-026a-000000000570 11000 1726867169.77175: done sending task result for task 0affcac9-a3a5-c734-026a-000000000570 11000 1726867169.77181: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008490", "end": "2024-09-20 17:19:29.718426", "rc": 0, "start": "2024-09-20 17:19:29.709936" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:fe:d3:7d:4f brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.12.57/22 brd 
10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3201sec preferred_lft 3201sec inet6 fe80::8ff:feff:fed3:7d4f/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.12.57 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.12.57 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 11000 1726867169.77256: no more pending results, returning what we have 11000 1726867169.77260: results queue empty 11000 1726867169.77260: checking for any_errors_fatal 11000 1726867169.77262: done checking for any_errors_fatal 11000 1726867169.77262: checking for max_fail_percentage 11000 1726867169.77263: done checking for max_fail_percentage 11000 1726867169.77264: checking to see if all hosts have failed and the running result is not ok 11000 1726867169.77265: done checking to see if all hosts have failed 11000 1726867169.77266: getting the remaining hosts for this loop 11000 1726867169.77268: done getting the remaining hosts for this loop 11000 1726867169.77271: getting the next task for host managed_node1 11000 1726867169.77279: done getting next task for host managed_node1 11000 1726867169.77281: ^ task is: TASK: Verify DNS and network connectivity 11000 1726867169.77284: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11000 1726867169.77294: getting variables 11000 1726867169.77296: in VariableManager get_vars() 11000 1726867169.77453: Calling all_inventory to load vars for managed_node1 11000 1726867169.77456: Calling groups_inventory to load vars for managed_node1 11000 1726867169.77458: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867169.77467: Calling all_plugins_play to load vars for managed_node1 11000 1726867169.77470: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867169.77472: Calling groups_plugins_play to load vars for managed_node1 11000 1726867169.79045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867169.80980: done with get_vars() 11000 1726867169.81011: done getting variables 11000 1726867169.81075: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 17:19:29 -0400 (0:00:00.399) 0:00:31.454 ****** 11000 1726867169.81110: entering _queue_task() for managed_node1/shell 11000 1726867169.81461: worker is 1 (out of 1 available) 11000 1726867169.81473: exiting _queue_task() for managed_node1/shell 11000 1726867169.81492: done queuing things up, now waiting for results queue to drain 11000 1726867169.81493: waiting for pending results... 11000 1726867169.81783: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 11000 1726867169.81918: in run() - task 0affcac9-a3a5-c734-026a-000000000571 11000 1726867169.81945: variable 'ansible_search_path' from source: unknown 11000 1726867169.81955: variable 'ansible_search_path' from source: unknown 11000 1726867169.82053: calling self._execute() 11000 1726867169.82121: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867169.82136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867169.82152: variable 'omit' from source: magic vars 11000 1726867169.82561: variable 'ansible_distribution_major_version' from source: facts 11000 1726867169.82582: Evaluated conditional (ansible_distribution_major_version != '6'): True 11000 1726867169.82732: variable 'ansible_facts' from source: unknown 11000 1726867169.83506: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 11000 1726867169.83568: variable 'omit' from source: magic vars 11000 1726867169.83572: variable 'omit' from source: magic vars 11000 1726867169.83607: variable 'omit' from source: magic vars 11000 1726867169.83649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11000 1726867169.83698: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11000 1726867169.83724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11000 1726867169.83744: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867169.83759: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11000 1726867169.83799: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11000 1726867169.83807: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867169.83814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867169.84082: Set connection var ansible_shell_type to sh 11000 1726867169.84085: Set connection var ansible_pipelining to False 11000 1726867169.84087: Set connection var ansible_shell_executable to /bin/sh 11000 1726867169.84093: Set connection var ansible_connection to ssh 11000 1726867169.84096: Set connection var ansible_timeout to 10 11000 1726867169.84098: Set connection var ansible_module_compression to ZIP_DEFLATED 11000 1726867169.84101: variable 'ansible_shell_executable' from source: unknown 11000 1726867169.84104: variable 'ansible_connection' from source: unknown 11000 1726867169.84107: variable 'ansible_module_compression' from source: unknown 11000 1726867169.84109: variable 'ansible_shell_type' from source: unknown 11000 1726867169.84112: variable 'ansible_shell_executable' from source: unknown 11000 1726867169.84114: variable 'ansible_host' from source: host vars for 'managed_node1' 11000 1726867169.84116: variable 'ansible_pipelining' from source: unknown 11000 1726867169.84119: variable 'ansible_timeout' from source: unknown 11000 1726867169.84122: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11000 1726867169.84184: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867169.84203: variable 'omit' from source: magic vars 11000 1726867169.84212: starting attempt loop 11000 1726867169.84218: running the handler 11000 1726867169.84236: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11000 1726867169.84260: _low_level_execute_command(): starting 11000 1726867169.84271: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11000 1726867169.85101: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/ac0999e354' <<< 11000 1726867169.85121: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867169.85232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867169.85327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867169.86976: stdout chunk (state=3): >>>/root <<< 11000 1726867169.87106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867169.87109: stderr chunk (state=3): >>><<< 11000 1726867169.87282: stdout chunk (state=3): >>><<< 11000 1726867169.87286: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867169.87293: _low_level_execute_command(): starting 11000 1726867169.87297: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726867169.8713806-12442-270893936932550 `" && echo ansible-tmp-1726867169.8713806-12442-270893936932550="` echo /root/.ansible/tmp/ansible-tmp-1726867169.8713806-12442-270893936932550 `" ) && sleep 0' 11000 1726867169.87738: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867169.87744: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867169.87753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867169.87767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867169.87781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867169.87792: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867169.87799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867169.87814: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11000 1726867169.87822: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.12.57 is address <<< 11000 1726867169.87829: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11000 1726867169.87837: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867169.87951: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867169.87954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867169.87964: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867169.87966: stderr chunk (state=3): >>>debug2: match found <<< 11000 1726867169.87968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867169.87970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867169.87972: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867169.87996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867169.88073: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867169.89956: stdout chunk (state=3): >>>ansible-tmp-1726867169.8713806-12442-270893936932550=/root/.ansible/tmp/ansible-tmp-1726867169.8713806-12442-270893936932550 <<< 11000 1726867169.90100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867169.90109: stdout chunk (state=3): >>><<< 11000 1726867169.90119: stderr chunk (state=3): >>><<< 11000 1726867169.90139: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726867169.8713806-12442-270893936932550=/root/.ansible/tmp/ansible-tmp-1726867169.8713806-12442-270893936932550 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867169.90282: variable 'ansible_module_compression' from source: unknown 11000 1726867169.90285: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-110001ou6sey_/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11000 1726867169.90287: variable 'ansible_facts' from source: unknown 11000 1726867169.90367: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726867169.8713806-12442-270893936932550/AnsiballZ_command.py 11000 1726867169.90530: Sending initial data 11000 1726867169.90538: Sent initial data (156 bytes) 11000 1726867169.91138: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867169.91151: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867169.91175: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867169.91287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867169.91331: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867169.91369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867169.92924: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11000 1726867169.92975: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11000 1726867169.93082: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-110001ou6sey_/tmprl2_w7rk /root/.ansible/tmp/ansible-tmp-1726867169.8713806-12442-270893936932550/AnsiballZ_command.py <<< 11000 1726867169.93086: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726867169.8713806-12442-270893936932550/AnsiballZ_command.py" <<< 11000 1726867169.93298: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-110001ou6sey_/tmprl2_w7rk" to remote "/root/.ansible/tmp/ansible-tmp-1726867169.8713806-12442-270893936932550/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726867169.8713806-12442-270893936932550/AnsiballZ_command.py" <<< 11000 1726867169.93991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867169.94055: stderr chunk (state=3): >>><<< 11000 1726867169.94058: stdout chunk (state=3): >>><<< 11000 1726867169.94123: done transferring module to remote 11000 1726867169.94134: _low_level_execute_command(): starting 11000 1726867169.94282: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726867169.8713806-12442-270893936932550/ /root/.ansible/tmp/ansible-tmp-1726867169.8713806-12442-270893936932550/AnsiballZ_command.py && sleep 0' 11000 1726867169.94768: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867169.94787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867169.94798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867169.94812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867169.94895: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867169.94924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867169.94935: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867169.94957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867169.95026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867169.96953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867169.96957: stdout chunk (state=3): >>><<< 11000 1726867169.96959: stderr chunk (state=3): >>><<< 11000 1726867169.96962: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867169.96964: _low_level_execute_command(): starting 11000 1726867169.96967: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726867169.8713806-12442-270893936932550/AnsiballZ_command.py && sleep 0' 11000 1726867169.97522: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11000 1726867169.97537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11000 1726867169.97551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867169.97571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11000 1726867169.97601: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 <<< 11000 1726867169.97614: stderr chunk (state=3): >>>debug2: match not found <<< 11000 1726867169.97696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867169.97721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867169.97740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11000 1726867169.97761: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867169.97848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867170.39018: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 
wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1400 0 --:--:-- --:--:-- --:--:-- 1405\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 15002 0 --:--:-- --:--:-- --:--:-- 15315", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 17:19:30.128879", "end": "2024-09-20 17:19:30.388285", "delta": "0:00:00.259406", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11000 1726867170.40769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 
<<< 11000 1726867170.40793: stderr chunk (state=3): >>><<< 11000 1726867170.40796: stdout chunk (state=3): >>><<< 11000 1726867170.40821: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1400 0 --:--:-- --:--:-- --:--:-- 1405\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 15002 0 --:--:-- --:--:-- --:--:-- 15315", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 17:19:30.128879", "end": "2024-09-20 17:19:30.388285", "delta": "0:00:00.259406", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.12.57 closed. 11000 1726867170.40852: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726867169.8713806-12442-270893936932550/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11000 1726867170.40859: _low_level_execute_command(): starting 11000 1726867170.40864: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726867169.8713806-12442-270893936932550/ > /dev/null 2>&1 && sleep 0' 11000 1726867170.41283: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11000 1726867170.41287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867170.41301: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11000 1726867170.41352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' <<< 11000 1726867170.41357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11000 1726867170.41408: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11000 1726867170.43284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11000 1726867170.43313: stderr chunk (state=3): >>><<< 11000 1726867170.43318: stdout chunk (state=3): >>><<< 11000 1726867170.43386: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.12.57 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.12.57 originally 10.31.12.57 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11000 1726867170.43397: handler run complete 11000 1726867170.43400: Evaluated conditional (False): False 11000 1726867170.43402: attempt loop complete, returning result 11000 1726867170.43432: _execute() done 11000 1726867170.43487: dumping result to json 11000 1726867170.43494: done dumping result, returning 11000 1726867170.43507: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [0affcac9-a3a5-c734-026a-000000000571] 11000 1726867170.43526: sending task result for task 0affcac9-a3a5-c734-026a-000000000571 11000 1726867170.43627: done sending task result for task 0affcac9-a3a5-c734-026a-000000000571 11000 1726867170.43630: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.259406", "end": "2024-09-20 17:19:30.388285", "rc": 0, "start": "2024-09-20 17:19:30.128879" } STDOUT: CHECK DNS AND CONNECTIVITY 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 1400 0 --:--:-- --:--:-- --:--:-- 1405 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 15002 0 --:--:-- --:--:-- --:--:-- 15315 11000 1726867170.43731: no more pending results, returning what we have 11000 1726867170.43734: results queue empty 11000 1726867170.43734: checking for any_errors_fatal 11000 1726867170.43745: done checking for any_errors_fatal 11000 1726867170.43746: checking for max_fail_percentage 11000 1726867170.43748: done checking for 
max_fail_percentage 11000 1726867170.43749: checking to see if all hosts have failed and the running result is not ok 11000 1726867170.43750: done checking to see if all hosts have failed 11000 1726867170.43750: getting the remaining hosts for this loop 11000 1726867170.43752: done getting the remaining hosts for this loop 11000 1726867170.43755: getting the next task for host managed_node1 11000 1726867170.43765: done getting next task for host managed_node1 11000 1726867170.43767: ^ task is: TASK: meta (flush_handlers) 11000 1726867170.43769: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11000 1726867170.43772: getting variables 11000 1726867170.43774: in VariableManager get_vars() 11000 1726867170.43820: Calling all_inventory to load vars for managed_node1 11000 1726867170.43823: Calling groups_inventory to load vars for managed_node1 11000 1726867170.43825: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867170.43834: Calling all_plugins_play to load vars for managed_node1 11000 1726867170.43836: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867170.43838: Calling groups_plugins_play to load vars for managed_node1 11000 1726867170.44772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867170.45886: done with get_vars() 11000 1726867170.45908: done getting variables 11000 1726867170.45975: in VariableManager get_vars() 11000 1726867170.45994: Calling all_inventory to load vars for managed_node1 11000 1726867170.45997: Calling groups_inventory to load vars for managed_node1 11000 1726867170.45999: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867170.46004: Calling all_plugins_play to load vars for managed_node1 11000 1726867170.46006: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867170.46009: Calling groups_plugins_play to load vars for managed_node1 11000 1726867170.46736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867170.47593: done with get_vars() 11000 1726867170.47610: done queuing things up, now waiting for results queue to drain 11000 1726867170.47612: results queue empty 11000 1726867170.47613: checking for any_errors_fatal 11000 1726867170.47615: done checking for any_errors_fatal 11000 1726867170.47616: checking for max_fail_percentage 11000 1726867170.47617: done checking for max_fail_percentage 11000 1726867170.47617: checking to see if all hosts have failed and the running result is not ok 11000 1726867170.47617: done checking to see if all hosts have failed 11000 1726867170.47618: getting the remaining hosts for this loop 11000 1726867170.47619: done getting the remaining hosts for this loop 11000 1726867170.47621: getting the next task for host managed_node1 11000 1726867170.47623: done getting next task for host managed_node1 11000 1726867170.47624: ^ task is: TASK: meta (flush_handlers) 11000 1726867170.47625: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11000 1726867170.47626: getting variables 11000 1726867170.47627: in VariableManager get_vars() 11000 1726867170.47636: Calling all_inventory to load vars for managed_node1 11000 1726867170.47638: Calling groups_inventory to load vars for managed_node1 11000 1726867170.47639: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867170.47642: Calling all_plugins_play to load vars for managed_node1 11000 1726867170.47644: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867170.47645: Calling groups_plugins_play to load vars for managed_node1 11000 1726867170.48304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867170.49134: done with get_vars() 11000 1726867170.49147: done getting variables 11000 1726867170.49181: in VariableManager get_vars() 11000 1726867170.49193: Calling all_inventory to load vars for managed_node1 11000 1726867170.49194: Calling groups_inventory to load vars for managed_node1 11000 1726867170.49196: Calling all_plugins_inventory to load vars for managed_node1 11000 1726867170.49199: Calling all_plugins_play to load vars for managed_node1 11000 1726867170.49204: Calling groups_plugins_inventory to load vars for managed_node1 11000 1726867170.49205: Calling groups_plugins_play to load vars for managed_node1 11000 1726867170.49828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11000 1726867170.50747: done with get_vars() 11000 1726867170.50766: done queuing things up, now waiting for results queue to drain 11000 1726867170.50767: results queue empty 11000 1726867170.50768: checking for any_errors_fatal 11000 1726867170.50769: done checking for any_errors_fatal 11000 1726867170.50769: checking for max_fail_percentage 11000 1726867170.50770: done checking for max_fail_percentage 11000 1726867170.50770: checking to see if all hosts have failed and the running result is not ok 11000 1726867170.50771: done checking to see if all hosts have failed 11000 1726867170.50771: getting the remaining hosts for this loop 11000 1726867170.50772: done getting the remaining hosts for this loop 11000 1726867170.50774: getting the next task for host managed_node1 11000 1726867170.50776: done getting next task for host managed_node1 11000 1726867170.50778: ^ task is: None 11000 1726867170.50779: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11000 1726867170.50780: done queuing things up, now waiting for results queue to drain 11000 1726867170.50781: results queue empty 11000 1726867170.50781: checking for any_errors_fatal 11000 1726867170.50782: done checking for any_errors_fatal 11000 1726867170.50782: checking for max_fail_percentage 11000 1726867170.50783: done checking for max_fail_percentage 11000 1726867170.50783: checking to see if all hosts have failed and the running result is not ok 11000 1726867170.50784: done checking to see if all hosts have failed 11000 1726867170.50785: getting the next task for host managed_node1 11000 1726867170.50786: done getting next task for host managed_node1 11000 1726867170.50787: ^ task is: None 11000 1726867170.50787: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node1 : ok=76 changed=3 unreachable=0 failed=0 skipped=60 rescued=0 ignored=0

Friday 20 September 2024 17:19:30 -0400 (0:00:00.697) 0:00:32.151 ******
===============================================================================
Install dnsmasq --------------------------------------------------------- 2.01s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 1.96s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.75s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 1.69s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Gathering Facts --------------------------------------------------------- 1.56s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.22s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.09s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.92s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.90s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.86s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gather the minimum subset of ansible_facts required by the network role test --- 0.85s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
fedora.linux_system_roles.network : Check which packages are installed --- 0.82s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Verify DNS and network connectivity ------------------------------------- 0.70s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.68s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Install pgrep, sysctl --------------------------------------------------- 0.66s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.65s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Check if system is ostree ----------------------------------------------- 0.63s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Get stat for interface test1 -------------------------------------------- 0.45s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Remove test interfaces -------------------------------------------------- 0.44s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3
Get stat for interface test2 -------------------------------------------- 0.44s
/tmp/collections-Isn/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3

11000 1726867170.50874: RUNNING CLEANUP
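
Note: for reference, the shell payloads run by the "Check routes and DNS" and "Verify DNS and network connectivity" tasks above are reproduced here in unescaped form, taken from the _raw_params values recorded in the task results. Collecting both into a single standalone script is illustrative only, for re-running the same checks by hand on the managed node with /bin/sh, as Ansible did.

  #!/bin/sh
  # Diagnostic payloads as recorded in the log above.
  # 'set -euo pipefail' matches the recorded payload (the target's /bin/sh supports it).
  set -euo pipefail

  # --- Check routes and DNS ---
  echo IP
  ip a
  echo IP ROUTE
  ip route
  echo IP -6 ROUTE
  ip -6 route
  echo RESOLV
  if [ -f /etc/resolv.conf ]; then
      cat /etc/resolv.conf
  else
      echo NO /etc/resolv.conf
      ls -alrtF /etc/resolv.* || :
  fi

  # --- Verify DNS and network connectivity ---
  echo CHECK DNS AND CONNECTIVITY
  for host in mirrors.fedoraproject.org mirrors.centos.org; do
      if ! getent hosts "$host"; then
          echo FAILED to lookup host "$host"
          exit 1
      fi
      if ! curl -o /dev/null https://"$host"; then
          echo FAILED to contact host "$host"
          exit 1
      fi
  done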
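
Note: the recurring "auto-mux: Trying existing master at '/root/.ansible/cp/ac0999e354'" lines show OpenSSH connection multiplexing; every command in this run reuses one master SSH session instead of opening a new connection. A minimal manual equivalent is sketched below. Only the control path and the 10.31.12.57 address come from the log; the root user is inferred from the remote home directory /root, and the ControlPersist value is an assumption based on Ansible's stock ssh_args, not taken from this run.

  # Open (or reuse) a multiplexed master connection; further commands that
  # pass the same ControlPath attach to it, which is what the
  # mux_client_hello_exchange / mux_client_request_session lines above show.
  ssh -o ControlMaster=auto \
      -o ControlPersist=60s \
      -o ControlPath=/root/.ansible/cp/ac0999e354 \
      root@10.31.12.57 true

  # Ask the existing master whether it is still alive.
  ssh -o ControlPath=/root/.ansible/cp/ac0999e354 -O check root@10.31.12.57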
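
Note: each task in the log follows the same low-level sequence of _low_level_execute_command() calls: discover the remote home directory, create a private temporary directory, upload the AnsiballZ_command.py payload over SFTP, mark it executable, run it with the remote Python, and remove the directory again. The sketch below replays that sequence by hand over the multiplexed connection; TMPDIR stands in for the unique ansible-tmp-* name Ansible generates per task, the local AnsiballZ_command.py file is assumed to exist, and the many extra ssh options Ansible actually passes are omitted.

  HOST=10.31.12.57
  CP=/root/.ansible/cp/ac0999e354
  TMPDIR=/root/.ansible/tmp/ansible-tmp-EXAMPLE   # placeholder for the generated name

  ssh -o ControlPath="$CP" root@"$HOST" 'echo ~'                                    # 1. find the remote home dir
  ssh -o ControlPath="$CP" root@"$HOST" "umask 77 && mkdir -p $TMPDIR"              # 2. private temp dir (mode 0700)
  echo "put AnsiballZ_command.py $TMPDIR/AnsiballZ_command.py" | \
      sftp -o ControlPath="$CP" root@"$HOST"                                        # 3. upload the module payload
  ssh -o ControlPath="$CP" root@"$HOST" "chmod u+x $TMPDIR $TMPDIR/AnsiballZ_command.py"    # 4. make it executable
  ssh -o ControlPath="$CP" root@"$HOST" "/usr/bin/python3.12 $TMPDIR/AnsiballZ_command.py"  # 5. run the module
  ssh -o ControlPath="$CP" root@"$HOST" "rm -f -r $TMPDIR"                          # 6. clean up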