13271 1727203815.90067: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-bGV
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
13271 1727203815.90428: Added group all to inventory
13271 1727203815.90430: Added group ungrouped to inventory
13271 1727203815.90433: Group all now contains ungrouped
13271 1727203815.90435: Examining possible inventory source: /tmp/network-zt6/inventory-rSl.yml
13271 1727203816.01933: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
13271 1727203816.01973: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
13271 1727203816.01992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
13271 1727203816.02032: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
13271 1727203816.02084: Loaded config def from plugin (inventory/script)
13271 1727203816.02087: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
13271 1727203816.02127: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
13271 1727203816.02195: Loaded config def from plugin (inventory/yaml)
13271 1727203816.02198: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
13271 1727203816.02261: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
13271 1727203816.02694: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
13271 1727203816.02697: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
13271 1727203816.02700: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
13271 1727203816.02706: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
13271 1727203816.02711: Loading data from /tmp/network-zt6/inventory-rSl.yml
13271 1727203816.02783: /tmp/network-zt6/inventory-rSl.yml was not parsable by auto
13271 1727203816.02848: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
13271 1727203816.02889: Loading data from /tmp/network-zt6/inventory-rSl.yml
13271 1727203816.02969: group all already in inventory
13271 1727203816.02978: set inventory_file for managed-node1
13271 1727203816.02982: set inventory_dir for managed-node1
13271 1727203816.02983: Added host managed-node1 to inventory
13271 1727203816.02986: Added host managed-node1 to group all
13271 1727203816.02987: set ansible_host for managed-node1
13271 1727203816.02987: set ansible_ssh_extra_args for managed-node1
13271 1727203816.02991: set inventory_file for managed-node2
13271 1727203816.02994: set inventory_dir for managed-node2
13271 1727203816.02994: Added host managed-node2 to inventory
13271 1727203816.02996: Added host managed-node2 to group all
13271 1727203816.02997: set ansible_host for managed-node2
13271 1727203816.02998: set ansible_ssh_extra_args for managed-node2
13271 1727203816.03001: set inventory_file for managed-node3
13271 1727203816.03003: set inventory_dir for managed-node3
13271 1727203816.03004: Added host managed-node3 to inventory
13271 1727203816.03005: Added host managed-node3 to group all
13271 1727203816.03005: set ansible_host for managed-node3
13271 1727203816.03006: set ansible_ssh_extra_args for managed-node3
13271 1727203816.03009: Reconcile groups and hosts in inventory.
13271 1727203816.03012: Group ungrouped now contains managed-node1
13271 1727203816.03014: Group ungrouped now contains managed-node2
13271 1727203816.03015: Group ungrouped now contains managed-node3
13271 1727203816.03095: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
13271 1727203816.03217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
13271 1727203816.03265: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
13271 1727203816.03295: Loaded config def from plugin (vars/host_group_vars)
13271 1727203816.03298: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
13271 1727203816.03305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
13271 1727203816.03313: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
13271 1727203816.03355: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
13271 1727203816.03698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203816.03793: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
13271 1727203816.03831: Loaded config def from plugin (connection/local)
13271 1727203816.03835: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
13271 1727203816.04492: Loaded config def from plugin (connection/paramiko_ssh)
13271 1727203816.04495: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
13271 1727203816.05768: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
13271 1727203816.05812: Loaded config def from plugin (connection/psrp)
13271 1727203816.05815: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
13271 1727203816.07050: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
13271 1727203816.07093: Loaded config def from plugin (connection/ssh)
13271 1727203816.07097: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
13271 1727203816.11090: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
13271 1727203816.11131: Loaded config def from plugin (connection/winrm)
13271 1727203816.11135: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
13271 1727203816.11167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
13271 1727203816.11340: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
13271 1727203816.11408: Loaded config def from plugin (shell/cmd)
13271 1727203816.11411: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
13271 1727203816.11438: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
13271 1727203816.11645: Loaded config def from plugin (shell/powershell)
13271 1727203816.11647: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
13271 1727203816.11830: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
13271 1727203816.12172: Loaded config def from plugin (shell/sh)
13271 1727203816.12174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
13271 1727203816.12210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
13271 1727203816.12327: Loaded config def from plugin (become/runas)
13271 1727203816.12330: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
13271 1727203816.12511: Loaded config def from plugin (become/su)
13271 1727203816.12514: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
13271 1727203816.12859: Loaded config def from plugin (become/sudo)
13271 1727203816.12861: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
13271 1727203816.12966: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml
13271 1727203816.13665: in VariableManager get_vars()
13271 1727203816.13690: done with get_vars()
13271 1727203816.13923: trying /usr/local/lib/python3.12/site-packages/ansible/modules
13271 1727203816.19408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
13271 1727203816.19523: in VariableManager get_vars()
13271 1727203816.19529: done with get_vars()
13271 1727203816.19532: variable 'playbook_dir' from source: magic vars
13271 1727203816.19533: variable 'ansible_playbook_python' from source: magic vars
13271 1727203816.19534: variable 'ansible_config_file' from source: magic vars
13271 1727203816.19534: variable 'groups' from source: magic vars
13271 1727203816.19535: variable 'omit' from source: magic vars
13271 1727203816.19536: variable 'ansible_version' from source: magic vars
13271 1727203816.19536: variable 'ansible_check_mode' from source: magic vars
13271 1727203816.19537: variable 'ansible_diff_mode' from source: magic vars
13271 1727203816.19538: variable 'ansible_forks' from source: magic vars
13271 1727203816.19538: variable 'ansible_inventory_sources' from source: magic vars
13271 1727203816.19539: variable 'ansible_skip_tags' from source: magic vars
13271 1727203816.19540: variable 'ansible_limit' from source: magic vars
13271 1727203816.19541: variable 'ansible_run_tags' from source: magic vars
13271 1727203816.19541: variable 'ansible_verbosity' from source: magic vars
13271 1727203816.19579: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml
13271 1727203816.20253: in VariableManager get_vars()
13271 1727203816.20270: done with get_vars()
13271 1727203816.20281: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
13271 1727203816.21818: in VariableManager get_vars()
13271 1727203816.21832: done with get_vars()
13271 1727203816.21841: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
13271 1727203816.21944: in VariableManager get_vars()
13271 1727203816.21960: done with get_vars()
13271 1727203816.22103: in VariableManager get_vars()
13271 1727203816.22116: done with get_vars()
13271 1727203816.22124: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
13271 1727203816.22193: in VariableManager get_vars()
13271 1727203816.22207: done with get_vars()
13271 1727203816.22478: in VariableManager get_vars()
13271 1727203816.22492: done with get_vars()
13271 1727203816.22497: variable 'omit' from source: magic vars
13271 1727203816.22515: variable 'omit' from source: magic vars
13271 1727203816.22549: in VariableManager get_vars()
13271 1727203816.22560: done with get_vars()
13271 1727203816.22607: in VariableManager get_vars()
13271 1727203816.22620: done with get_vars()
13271 1727203816.22651: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
13271 1727203816.22856: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
13271 1727203816.23186: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
13271 1727203816.24132: in VariableManager get_vars()
13271 1727203816.24151: done with get_vars()
13271 1727203816.25341: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
13271 1727203816.25477: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13271 1727203816.27018: in VariableManager get_vars()
13271 1727203816.27035: done with get_vars()
13271 1727203816.27044: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
13271 1727203816.27210: in VariableManager get_vars()
13271 1727203816.27228: done with get_vars()
13271 1727203816.27339: in VariableManager get_vars()
13271 1727203816.27354: done with get_vars()
13271 1727203816.27622: in VariableManager get_vars()
13271 1727203816.27639: done with get_vars()
13271 1727203816.27643: variable 'omit' from source: magic vars
13271 1727203816.27666: variable 'omit' from source: magic vars
13271 1727203816.27809: in VariableManager get_vars()
13271 1727203816.27822: done with get_vars()
13271 1727203816.27841: in VariableManager get_vars()
13271 1727203816.27857: done with get_vars()
13271 1727203816.27889: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
13271 1727203816.28024: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
13271 1727203816.30138: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
13271 1727203816.30626: in VariableManager get_vars()
13271 1727203816.30650: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
13271 1727203816.33305: in VariableManager get_vars()
13271 1727203816.33319: done with get_vars()
13271 1727203816.33325: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
13271 1727203816.33641: in VariableManager get_vars()
13271 1727203816.33654: done with get_vars()
13271 1727203816.33695: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
13271 1727203816.33706: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
13271 1727203816.33871: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
13271 1727203816.33966: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
13271 1727203816.33968: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
13271 1727203816.33990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
13271 1727203816.34006: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
13271 1727203816.34106: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
13271 1727203816.34143: Loaded config def from plugin (callback/default)
13271 1727203816.34145: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
13271 1727203816.35409: Loaded config def from plugin (callback/junit)
13271 1727203816.35412: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
13271 1727203816.35458: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
13271 1727203816.35527: Loaded config def from plugin (callback/minimal)
13271 1727203816.35530: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
13271 1727203816.35568: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
13271 1727203816.35633: Loaded config def from plugin (callback/tree)
13271 1727203816.35635: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
13271 1727203816.35764: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
13271 1727203816.35766: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-bGV/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_bond_nm.yml ****************************************************
2 plays in /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml
13271 1727203816.35794: in VariableManager get_vars()
13271 1727203816.35807: done with get_vars()
13271 1727203816.35813: in VariableManager get_vars()
13271 1727203816.35821: done with get_vars()
13271 1727203816.35825: variable 'omit' from source: magic vars
13271 1727203816.35864: in VariableManager get_vars()
13271 1727203816.35879: done with get_vars()
13271 1727203816.35899: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond.yml' with nm as provider] *************
13271 1727203816.36440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
13271 1727203816.36493: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
13271 1727203816.36519: getting the remaining hosts for this loop
13271 1727203816.36520: done getting the remaining hosts for this loop
13271 1727203816.36522: getting the next task for host managed-node1
13271 1727203816.36525: done getting next task for host managed-node1
13271 1727203816.36526: ^ task is: TASK: Gathering Facts
13271 1727203816.36527: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203816.36529: getting variables
13271 1727203816.36530: in VariableManager get_vars()
13271 1727203816.36537: Calling all_inventory to load vars for managed-node1
13271 1727203816.36539: Calling groups_inventory to load vars for managed-node1
13271 1727203816.36541: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203816.36549: Calling all_plugins_play to load vars for managed-node1
13271 1727203816.36555: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203816.36557: Calling groups_plugins_play to load vars for managed-node1
13271 1727203816.36583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203816.36616: done with get_vars()
13271 1727203816.36621: done getting variables
13271 1727203816.36669: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6
Tuesday 24 September 2024  14:50:16 -0400 (0:00:00.010)       0:00:00.010 *****
13271 1727203816.36685: entering _queue_task() for managed-node1/gather_facts
13271 1727203816.36686: Creating lock for gather_facts
13271 1727203816.36970: worker is 1 (out of 1 available)
13271 1727203816.36982: exiting _queue_task() for managed-node1/gather_facts
13271 1727203816.36993: done queuing things up, now waiting for results queue to drain
13271 1727203816.36995: waiting for pending results...
13271 1727203816.37127: running TaskExecutor() for managed-node1/TASK: Gathering Facts
13271 1727203816.37179: in run() - task 028d2410-947f-2a40-12ba-0000000000cc
13271 1727203816.37191: variable 'ansible_search_path' from source: unknown
13271 1727203816.37219: calling self._execute()
13271 1727203816.37266: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203816.37270: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203816.37278: variable 'omit' from source: magic vars
13271 1727203816.37342: variable 'omit' from source: magic vars
13271 1727203816.37472: variable 'omit' from source: magic vars
13271 1727203816.37477: variable 'omit' from source: magic vars
13271 1727203816.37480: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13271 1727203816.37484: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13271 1727203816.37486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13271 1727203816.37489: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13271 1727203816.37491: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13271 1727203816.37506: variable 'inventory_hostname' from source: host vars for 'managed-node1'
13271 1727203816.37509: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203816.37511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203816.37582: Set connection var ansible_connection to ssh
13271 1727203816.37585: Set connection var ansible_shell_type to sh
13271 1727203816.37593: Set connection var ansible_timeout to 10
13271 1727203816.37598: Set connection var ansible_module_compression to ZIP_DEFLATED
13271 1727203816.37603: Set connection var ansible_pipelining to False
13271 1727203816.37608: Set connection var ansible_shell_executable to /bin/sh
13271 1727203816.37627: variable 'ansible_shell_executable' from source: unknown
13271 1727203816.37657: variable 'ansible_connection' from source: unknown
13271 1727203816.37663: variable 'ansible_module_compression' from source: unknown
13271 1727203816.37666: variable 'ansible_shell_type' from source: unknown
13271 1727203816.37668: variable 'ansible_shell_executable' from source: unknown
13271 1727203816.37671: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203816.37673: variable 'ansible_pipelining' from source: unknown
13271 1727203816.37678: variable 'ansible_timeout' from source: unknown
13271 1727203816.37680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203816.38010: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
13271 1727203816.38014: variable 'omit' from source: magic vars
13271 1727203816.38016: starting attempt loop
13271 1727203816.38019: running the handler
13271 1727203816.38021: variable 'ansible_facts' from source: unknown
13271 1727203816.38024: _low_level_execute_command(): starting
13271 1727203816.38026: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
13271 1727203816.39000: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13271 1727203816.39116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13271 1727203816.39134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13271 1727203816.39211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.14.47 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47
debug2: match found <<<
13271 1727203816.39255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13271 1727203816.39294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
13271 1727203816.39307: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4 <<<
13271 1727203816.39398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13271 1727203816.41223: stdout chunk (state=3): >>>/root <<<
13271 1727203816.41378: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13271 1727203816.41401: stdout chunk (state=3): >>><<<
13271 1727203816.41422: stderr chunk (state=3): >>><<<
13271 1727203816.41463: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.14.47 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
13271 1727203816.41494: _low_level_execute_command(): starting
13271 1727203816.41532: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203816.4147036-13441-33177150495789 `" && echo ansible-tmp-1727203816.4147036-13441-33177150495789="` echo /root/.ansible/tmp/ansible-tmp-1727203816.4147036-13441-33177150495789 `" ) && sleep 0'
13271 1727203816.42659: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.14.47 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13271 1727203816.42766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
13271 1727203816.42787: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13271 1727203816.42807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13271 1727203816.42923: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13271 1727203816.45096: stdout chunk (state=3): >>>ansible-tmp-1727203816.4147036-13441-33177150495789=/root/.ansible/tmp/ansible-tmp-1727203816.4147036-13441-33177150495789 <<<
13271 1727203816.45242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13271 1727203816.45246: stdout chunk (state=3): >>><<<
13271 1727203816.45249: stderr chunk (state=3): >>><<<
13271 1727203816.45281: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203816.4147036-13441-33177150495789=/root/.ansible/tmp/ansible-tmp-1727203816.4147036-13441-33177150495789
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.14.47 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47
debug2: match found
debug1: Reading configuration
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203816.45408: variable 'ansible_module_compression' from source: unknown 13271 1727203816.45411: ANSIBALLZ: Using generic lock for ansible.legacy.setup 13271 1727203816.45414: ANSIBALLZ: Acquiring lock 13271 1727203816.45416: ANSIBALLZ: Lock acquired: 140497830695696 13271 1727203816.45418: ANSIBALLZ: Creating module 13271 1727203816.70355: ANSIBALLZ: Writing module into payload 13271 1727203816.70512: ANSIBALLZ: Writing module 13271 1727203816.70540: ANSIBALLZ: Renaming module 13271 1727203816.70551: ANSIBALLZ: Done creating module 13271 1727203816.70596: variable 'ansible_facts' from source: unknown 13271 1727203816.70608: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203816.70623: _low_level_execute_command(): starting 13271 1727203816.70634: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 13271 1727203816.71238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203816.71255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203816.71269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203816.71291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203816.71308: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203816.71321: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203816.71335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203816.71354: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 1727203816.71442: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203816.71467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203816.71692: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203816.73397: stdout chunk (state=3): >>>PLATFORM <<< 13271 1727203816.73512: stdout chunk (state=3): >>>Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 13271 1727203816.73689: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203816.73693: stdout chunk (state=3): >>><<< 13271 1727203816.73696: stderr chunk (state=3): >>><<< 13271 1727203816.73717: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203816.73734 [managed-node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 13271 1727203816.74098: _low_level_execute_command(): starting 13271 1727203816.74102: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 13271 1727203816.74145: Sending initial data 13271 1727203816.74149: Sent initial data (1181 bytes) 13271 1727203816.75661: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203816.75677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203816.75690: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203816.75793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203816.75816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203816.75988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203816.79704: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 13271 1727203816.80202: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203816.80216: stdout chunk (state=3): >>><<< 13271 1727203816.80228: stderr chunk (state=3): >>><<< 13271 1727203816.80248: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203816.80583: variable 'ansible_facts' from source: unknown 13271 1727203816.80586: variable 'ansible_facts' from source: unknown 13271 1727203816.80588: variable 'ansible_module_compression' from source: unknown 13271 1727203816.80618: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 13271 1727203816.80652: variable 'ansible_facts' from source: unknown 13271 1727203816.81093: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203816.4147036-13441-33177150495789/AnsiballZ_setup.py 13271 1727203816.81318: Sending initial data 13271 1727203816.81322: Sent initial data (153 bytes) 13271 1727203816.81891: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203816.81963: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203816.82013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203816.82040: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203816.82070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203816.82181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203816.83942: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203816.84046: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13271 1727203816.84120: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmp_nj2ob72 /root/.ansible/tmp/ansible-tmp-1727203816.4147036-13441-33177150495789/AnsiballZ_setup.py <<< 13271 1727203816.84131: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203816.4147036-13441-33177150495789/AnsiballZ_setup.py" <<< 13271 1727203816.84240: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmp_nj2ob72" to remote "/root/.ansible/tmp/ansible-tmp-1727203816.4147036-13441-33177150495789/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203816.4147036-13441-33177150495789/AnsiballZ_setup.py" <<< 13271 1727203816.86118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203816.86122: stdout chunk (state=3): >>><<< 13271 1727203816.86125: stderr chunk (state=3): >>><<< 13271 1727203816.86127: done transferring module to remote 13271 1727203816.86129: _low_level_execute_command(): starting 13271 1727203816.86265: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203816.4147036-13441-33177150495789/ /root/.ansible/tmp/ansible-tmp-1727203816.4147036-13441-33177150495789/AnsiballZ_setup.py && sleep 0' 13271 
1727203816.86992: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203816.87038: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13271 1727203816.87131: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 13271 1727203816.87150: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203816.87174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203816.87293: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203816.89329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203816.89333: stdout chunk (state=3): >>><<< 13271 1727203816.89335: stderr chunk (state=3): >>><<< 13271 1727203816.89364: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203816.89381: _low_level_execute_command(): starting 13271 1727203816.89454: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203816.4147036-13441-33177150495789/AnsiballZ_setup.py && sleep 0' 13271 1727203816.90594: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203816.90634: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203816.90664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203816.90681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203816.90816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203816.93203: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 13271 1727203816.93234: stdout chunk (state=3): >>>import _imp # builtin <<< 13271 1727203816.93269: stdout chunk (state=3): >>>import '_thread' # <<< 13271 1727203816.93283: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 13271 1727203816.93346: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 13271 1727203816.93379: stdout chunk (state=3): >>>import 'posix' # <<< 13271 1727203816.93424: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 13271 1727203816.93454: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 13271 1727203816.93507: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203816.93536: stdout chunk (state=3): >>>import '_codecs' # <<< 13271 1727203816.93552: stdout chunk (state=3): >>>import 'codecs' # <<< 13271 1727203816.93598: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py<<< 13271 1727203816.93631: stdout chunk (state=3): >>> # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28515104d0> <<< 13271 1727203816.93662: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28514dfb30> <<< 13271 1727203816.93684: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851512a50> <<< 13271 1727203816.93734: stdout chunk (state=3): >>>import '_signal' # <<< 13271 1727203816.93738: stdout chunk (state=3): >>>import '_abc' # <<< 13271 1727203816.93827: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 13271 1727203816.93830: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 13271 1727203816.94057: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<< 13271 1727203816.94087: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28512c1130> <<< 13271 1727203816.94213: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 13271 1727203816.94216: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28512c2060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 13271 1727203816.94617: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 13271 1727203816.94643: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 13271 1727203816.94661: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203816.94685: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 13271 1727203816.94732: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 13271 1727203816.94746: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 13271 1727203816.94784: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 13271 1727203816.94898: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28512ffe90> <<< 13271 1727203816.94902: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 13271 1727203816.94905: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 13271 1727203816.94907: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28512fff50> <<< 13271 1727203816.94930: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 13271 1727203816.94934: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 13271 1727203816.95004: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203816.95010: stdout chunk (state=3): >>>import 'itertools' # <<< 13271 1727203816.95062: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851337890> <<< 13271 1727203816.95167: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851337f20> <<< 13271 1727203816.95233: stdout chunk (state=3): >>>import '_collections' # <<< 13271 1727203816.95280: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851317b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851315280> <<< 13271 1727203816.95380: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f28512fd040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 13271 1727203816.95401: stdout chunk (state=3): >>>import '_sre' # <<< 13271 1727203816.95515: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285135b770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285135a390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851316120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28512fe900> <<< 13271 1727203816.95592: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 13271 1727203816.95595: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285138c830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28512fc2c0> <<< 13271 1727203816.95737: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # 
extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f285138cce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285138cb90> <<< 13271 1727203816.95747: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203816.95793: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f285138cf80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28512fade0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203816.95820: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 13271 1727203816.95847: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285138d640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285138d310> import 'importlib.machinery' # <<< 13271 1727203816.95895: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f285138e510> <<< 13271 1727203816.95902: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 13271 1727203816.95931: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 13271 1727203816.95961: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 13271 1727203816.96008: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28513a4710> <<< 13271 1727203816.96061: stdout chunk (state=3): >>>import 'errno' # <<< 13271 1727203816.96065: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28513a5dc0> <<< 13271 1727203816.96122: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 13271 1727203816.96125: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 13271 1727203816.96140: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28513a6c60> <<< 13271 1727203816.96188: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' 
executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28513a7290> <<< 13271 1727203816.96212: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28513a61b0> <<< 13271 1727203816.96228: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 13271 1727203816.96269: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28513a7d10> <<< 13271 1727203816.96289: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28513a7440> <<< 13271 1727203816.96341: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285138e480> <<< 13271 1727203816.96358: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 13271 1727203816.96389: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 13271 1727203816.96400: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 13271 1727203816.96427: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 13271 1727203816.96469: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from 
'/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28510afc80> <<< 13271 1727203816.96498: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 13271 1727203816.96535: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203816.96562: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28510d86b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28510d8440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28510d86e0> <<< 13271 1727203816.96590: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 13271 1727203816.96668: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203816.96800: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28510d9010> <<< 13271 1727203816.96953: stdout chunk (state=3): >>># extension 
module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28510d9970> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28510d88c0> <<< 13271 1727203816.96980: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28510ade20> <<< 13271 1727203816.97014: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 13271 1727203816.97035: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 13271 1727203816.97065: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 13271 1727203816.97090: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28510dad20> <<< 13271 1727203816.97116: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28510d8e90> <<< 13271 1727203816.97150: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285138ec30> <<< 13271 1727203816.97153: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 13271 1727203816.97227: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203816.97242: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches 
/usr/lib64/python3.12/threading.py <<< 13271 1727203816.97265: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 13271 1727203816.97301: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851107080> <<< 13271 1727203816.97358: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 13271 1727203816.97389: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203816.97391: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 13271 1727203816.97414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 13271 1727203816.97452: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285112b410> <<< 13271 1727203816.97485: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 13271 1727203816.97528: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 13271 1727203816.97594: stdout chunk (state=3): >>>import 'ntpath' # <<< 13271 1727203816.97619: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203816.97631: stdout chunk (state=3): >>>import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28511881d0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 13271 
1727203816.97669: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 13271 1727203816.97699: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 13271 1727203816.97742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 13271 1727203816.97833: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285118a930> <<< 13271 1727203816.97921: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28511882f0> <<< 13271 1727203816.97951: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28511551f0> <<< 13271 1727203816.97994: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28511559a0> <<< 13271 1727203816.98021: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285112a210> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28510dbc50> <<< 13271 1727203816.98207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 13271 1727203816.98222: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f285112a330> <<< 13271 1727203816.98573: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_6px_ruct/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 13271 1727203816.98871: stdout 
chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 13271 1727203816.98912: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850ffaff0> <<< 13271 1727203816.98915: stdout chunk (state=3): >>>import '_typing' # <<< 13271 1727203816.99118: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850fd9ee0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850fd9070> <<< 13271 1727203816.99121: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203816.99167: stdout chunk (state=3): >>>import 'ansible' # <<< 13271 1727203816.99170: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203816.99207: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13271 1727203816.99228: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 13271 1727203816.99239: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.00697: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.01952: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f2850ff8e90> <<< 13271 1727203817.01998: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203817.02002: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 13271 1727203817.02019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 13271 1727203817.02042: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 13271 1727203817.02084: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2851032960> <<< 13271 1727203817.02113: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28510326f0> <<< 13271 1727203817.02152: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851032000> <<< 13271 1727203817.02190: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 13271 1727203817.02224: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 13271 1727203817.02237: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851032a50> import 'json' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2850ffba10> import 'atexit' # <<< 13271 1727203817.02274: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28510336e0> <<< 13271 1727203817.02309: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2851033920> <<< 13271 1727203817.02337: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 13271 1727203817.02402: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 13271 1727203817.02414: stdout chunk (state=3): >>>import '_locale' # <<< 13271 1727203817.02448: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851033e30> <<< 13271 1727203817.02484: stdout chunk (state=3): >>>import 'pwd' # <<< 13271 1727203817.02487: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 13271 1727203817.02517: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 13271 1727203817.02552: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850929be0> <<< 13271 1727203817.02589: stdout chunk (state=3): >>># extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f285092b800> <<< 13271 1727203817.02616: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 13271 1727203817.02630: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 13271 1727203817.02681: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28509301d0> <<< 13271 1727203817.02692: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 13271 1727203817.02739: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 13271 1727203817.02769: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28509310d0> <<< 13271 1727203817.02785: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 13271 1727203817.02841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 13271 1727203817.02851: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 13271 1727203817.03155: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850933dd0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module 
'_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28510d82f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850932090> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 13271 1727203817.03185: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 13271 1727203817.03219: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 13271 1727203817.03225: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850937c50> <<< 13271 1727203817.03234: stdout chunk (state=3): >>>import '_tokenize' # <<< 13271 1727203817.03312: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850936720> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850936480> <<< 13271 1727203817.03333: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 13271 1727203817.03357: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 13271 1727203817.03420: stdout chunk (state=3): >>>import 'textwrap' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f28509369f0> <<< 13271 1727203817.03454: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28509325a0> <<< 13271 1727203817.03489: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203817.03495: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f285097bef0> <<< 13271 1727203817.03522: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285097b920> <<< 13271 1727203817.03541: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 13271 1727203817.03565: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 13271 1727203817.03589: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 13271 1727203817.03638: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f285097da60><<< 13271 1727203817.03650: stdout chunk (state=3): >>> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285097d820> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 13271 1727203817.03686: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 13271 1727203817.03743: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f285097ffe0> <<< 13271 1727203817.03759: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285097e150> <<< 13271 1727203817.03771: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 13271 1727203817.03893: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 13271 1727203817.03910: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850983830> <<< 13271 1727203817.04047: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850980200> <<< 13271 1727203817.04110: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' 
executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28509845c0> <<< 13271 1727203817.04153: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2850984860> <<< 13271 1727203817.04218: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2850984b60> <<< 13271 1727203817.04221: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285097c1a0> <<< 13271 1727203817.04247: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 13271 1727203817.04280: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 13271 1727203817.04313: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203817.04344: stdout chunk (state=3): >>># extension module '_socket' executed 
from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2850810320> <<< 13271 1727203817.04512: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2850811580> <<< 13271 1727203817.04541: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850986ab0> <<< 13271 1727203817.04580: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2850987e30> <<< 13271 1727203817.04591: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850986690> # zipimport: zlib available # zipimport: zlib available <<< 13271 1727203817.04618: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 13271 1727203817.04709: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.04818: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 13271 1727203817.04854: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13271 1727203817.04871: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 13271 1727203817.04996: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 13271 1727203817.05114: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.05681: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.06262: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 13271 1727203817.06301: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 13271 1727203817.06327: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203817.06387: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28508157c0> <<< 13271 1727203817.06469: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 13271 1727203817.06496: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850816630> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850811790> <<< 13271 1727203817.06566: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 13271 1727203817.06569: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.06607: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 13271 1727203817.06610: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 13271 1727203817.06769: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.07089: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850816660> # zipimport: zlib available <<< 13271 1727203817.07438: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.07914: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.07988: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.08071: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 13271 1727203817.08115: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.08161: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 13271 1727203817.08172: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.08231: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.08357: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 13271 1727203817.08362: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.08365: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 13271 1727203817.08410: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.08444: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 13271 1727203817.08457: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.08698: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.08946: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches 
/usr/lib64/python3.12/ast.py <<< 13271 1727203817.09018: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 13271 1727203817.09033: stdout chunk (state=3): >>>import '_ast' # <<< 13271 1727203817.09112: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28508178f0> <<< 13271 1727203817.09115: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.09193: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.09274: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 13271 1727203817.09304: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 13271 1727203817.09307: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.09354: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.09402: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 13271 1727203817.09405: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.09447: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.09502: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.09551: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.09629: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 13271 1727203817.09670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203817.09778: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' 
# extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2850822030> <<< 13271 1727203817.09820: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285081f470> <<< 13271 1727203817.09860: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 13271 1727203817.09874: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.09937: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.09996: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.10030: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.10088: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203817.10114: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 13271 1727203817.10128: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 13271 1727203817.10147: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 13271 1727203817.10219: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 13271 1727203817.10256: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 13271 1727203817.10259: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 13271 1727203817.10315: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285090a9f0> <<< 13271 1727203817.10365: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28509fe6c0> <<< 13271 1727203817.10448: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850821eb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850930140> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 13271 1727203817.10468: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.10520: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.10538: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 13271 1727203817.10587: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 13271 1727203817.10621: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 13271 1727203817.10649: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.10812: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13271 1727203817.10814: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13271 1727203817.10980: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.10993: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 13271 1727203817.11084: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.11134: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 
1727203817.11155: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.11207: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 13271 1727203817.11216: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.11396: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.11570: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.11614: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.11686: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203817.11715: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 13271 1727203817.11763: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 13271 1727203817.11808: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28508b6540> <<< 13271 1727203817.11811: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 13271 1727203817.11834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 13271 1727203817.11847: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches 
/usr/lib64/python3.12/pickle.py <<< 13271 1727203817.11906: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 13271 1727203817.11921: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 13271 1727203817.11946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28504e4230> <<< 13271 1727203817.11983: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203817.12005: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28504e4590> <<< 13271 1727203817.12068: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28508a3500> <<< 13271 1727203817.12072: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28508b7080> <<< 13271 1727203817.12121: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28508b4c80> <<< 13271 1727203817.12126: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28508b46e0> <<< 13271 1727203817.12143: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 13271 1727203817.12224: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 13271 1727203817.12272: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 13271 1727203817.12277: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 13271 1727203817.12288: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 13271 1727203817.12321: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28504e7590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28504e6e40> <<< 13271 1727203817.12374: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203817.12390: stdout chunk (state=3): >>># extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28504e7020> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28504e6270> <<< 13271 1727203817.12435: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 13271 1727203817.12703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28504e7770> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f285054a210> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28504e4c50> <<< 13271 1727203817.12726: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28508b4890> import 'ansible.module_utils.facts.timeout' # <<< 13271 1727203817.12751: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 13271 1727203817.12758: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.12771: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 13271 1727203817.12804: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.12865: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.13047: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 13271 1727203817.13104: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 13271 1727203817.13116: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.13159: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 13271 1727203817.13165: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 
1727203817.13241: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.13269: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 13271 1727203817.13343: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.13380: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 13271 1727203817.13384: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.13492: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.13495: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.13585: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.13617: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 13271 1727203817.13620: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.14119: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.14594: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 13271 1727203817.14604: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.14660: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.14716: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.14754: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.14800: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 13271 1727203817.14813: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.14835: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.14867: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 13271 1727203817.14881: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 13271 1727203817.14933: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.14987: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 13271 1727203817.15024: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.15038: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.15068: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 13271 1727203817.15105: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.15148: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 13271 1727203817.15223: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.15315: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 13271 1727203817.15357: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285054a360> <<< 13271 1727203817.15403: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 13271 1727203817.15680: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285054af60> <<< 13271 1727203817.15686: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # <<< 13271 1727203817.15702: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 13271 1727203817.15773: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.15861: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 13271 1727203817.15879: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.16028: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 13271 1727203817.16059: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.16118: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 13271 1727203817.16192: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 13271 1727203817.16248: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203817.16310: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203817.16322: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2850582450> <<< 13271 1727203817.16696: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850810200> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 13271 1727203817.16737: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.16824: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.16992: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.17073: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 13271 1727203817.17130: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available <<< 
13271 1727203817.17169: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 13271 1727203817.17185: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.17236: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.17267: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 13271 1727203817.17357: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2850595dc0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28505959a0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 13271 1727203817.17373: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 13271 1727203817.17388: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.17426: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.17565: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 13271 1727203817.17647: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.17794: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 13271 1727203817.17822: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.17903: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.18005: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.18152: stdout chunk (state=3): >>># zipimport: 
zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 13271 1727203817.18289: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.18492: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 13271 1727203817.18663: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.18691: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 13271 1727203817.18732: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.18762: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.19360: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.19912: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 13271 1727203817.19915: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.20072: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.20132: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 13271 1727203817.20182: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.20241: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.20366: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 13271 1727203817.20740: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 13271 1727203817.20802: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 13271 1727203817.20805: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 13271 1727203817.21094: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13271 1727203817.21223: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.21439: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 13271 1727203817.21442: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # <<< 13271 1727203817.21445: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.21478: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.21583: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # <<< 13271 1727203817.21586: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.21659: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.21822: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 13271 1727203817.22001: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available <<< 13271 1727203817.22025: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 13271 1727203817.22039: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.22465: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.22585: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 13271 1727203817.22597: stdout chunk (state=3): 
>>># zipimport: zlib available <<< 13271 1727203817.22664: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.22775: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available <<< 13271 1727203817.22802: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 13271 1727203817.22814: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.22844: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.23096: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 13271 1727203817.23099: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 13271 1727203817.23134: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 13271 1727203817.23152: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13271 1727203817.23167: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 13271 1727203817.23226: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.23267: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 13271 1727203817.23327: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 13271 1727203817.23366: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.23430: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.23579: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.23583: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 13271 1727203817.23586: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 13271 1727203817.23588: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.23650: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.23693: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 13271 1727203817.23705: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.23909: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.24289: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available <<< 13271 1727203817.24293: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 13271 1727203817.24320: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # <<< 13271 1727203817.24324: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.24408: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.24501: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 13271 1727203817.24505: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 13271 1727203817.24507: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.24712: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 13271 1727203817.24799: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203817.25507: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f285032e8d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285032f230> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850327d40> <<< 13271 1727203817.37268: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 13271 1727203817.37273: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' <<< 13271 1727203817.37280: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850374380> <<< 13271 1727203817.37336: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850375310> <<< 13271 1727203817.37671: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850376bd0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850376240> <<< 13271 1727203817.37764: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 13271 1727203817.62132: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "50", "second": "17", "epoch": "1727203817", "epoch_int": "1727203817", "date": "2024-09-24", "time": "14:50:17", "iso8601_micro": "2024-09-24T18:50:17.255721Z", "iso8601": "2024-09-24T18:50:17Z", "iso8601_basic": "20240924T145017255721", "iso8601_basic_short": "20240924T145017", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": 
"ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 
@ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2930, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 601, "free": 2930}, "nocache": {"free": 3282, "used": 249}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 408, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785722880, "block_size": 4096, "block_total": 65519099, "block_available": 63912530, "block_used": 1606569, "inode_total": 131070960, "inode_available": 131027285, "inode_used": 43675, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.30810546875, "5m": 0.24267578125, "15m": 0.11962890625}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": <<< 13271 1727203817.62176: stdout chunk (state=3): >>>{"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", 
"tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", 
"tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13271 1727203817.62879: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv<<< 13271 1727203817.63032: stdout chunk (state=3): >>> # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] 
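The chunk above completes the `setup` module's result: a single JSON object whose `ansible_facts` key carries everything from distribution strings to per-NIC offload flags. Since it is plain JSON, it can be inspected offline with nothing but the standard library; a minimal sketch (using a small excerpt of the payload copied from the log — the real object is far larger) that pulls out the same keys a playbook would reference as `{{ ansible_distribution }}` or `{{ ansible_default_ipv4.address }}`:

```python
import json

# Excerpt of the "ansible_facts" payload emitted by the setup module above
# (values copied verbatim from the log; the full payload is much larger).
raw = '''
{"ansible_facts": {"ansible_distribution": "CentOS",
                   "ansible_distribution_major_version": "10",
                   "ansible_default_ipv4": {"interface": "eth0",
                                            "address": "10.31.14.47",
                                            "prefix": "22"}}}
'''

facts = json.loads(raw)["ansible_facts"]

# Equivalent to the Jinja2 lookups a task would use on these fact names.
print(facts["ansible_distribution"], facts["ansible_distribution_major_version"])
print(facts["ansible_default_ipv4"]["address"])
```

This is why `-vvvvv` output is still debuggable despite its volume: every stdout chunk between the `>>>`/`<<<` markers is either interpreter tracing or a well-formed JSON document.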
removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # 
cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp <<< 13271 1727203817.63054: stdout chunk (state=3): >>># cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd <<< 13271 1727203817.63057: stdout chunk (state=3): >>># cleanup[2] removing _datetime # 
cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six <<< 13271 1727203817.63059: stdout chunk (state=3): >>># destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters <<< 13271 1727203817.63186: stdout chunk (state=3): >>># destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy 
ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing 
multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing 
ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] 
removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other <<< 13271 1727203817.63193: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb <<< 13271 
1727203817.63251: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # 
destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 13271 1727203817.63813: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 13271 1727203817.63878: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath <<< 13271 1727203817.63923: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder <<< 13271 1727203817.64014: stdout chunk (state=3): >>># destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 13271 1727203817.64039: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil <<< 13271 1727203817.64056: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 
13271 1727203817.64108: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 13271 1727203817.64153: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 13271 1727203817.64375: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 13271 1727203817.64381: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing <<< 13271 1727203817.64384: stdout chunk (state=3): >>># destroy array # destroy multiprocessing.dummy.connection <<< 13271 1727203817.64456: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 13271 1727203817.64484: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # 
cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 13271 1727203817.64535: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib <<< 13271 1727203817.64628: stdout chunk (state=3): >>># cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 13271 1727203817.64633: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools <<< 13271 1727203817.64642: stdout chunk (state=3): >>># cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 13271 1727203817.64704: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] 
wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 13271 1727203817.64806: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 13271 1727203817.64958: stdout chunk (state=3): >>># destroy sys.monitoring <<< 13271 1727203817.64961: stdout chunk (state=3): >>># destroy _socket <<< 13271 1727203817.65030: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 13271 1727203817.65105: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 13271 1727203817.65126: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 13271 1727203817.65230: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules <<< 13271 1727203817.65234: stdout chunk (state=3): >>># destroy _frozen_importlib <<< 13271 1727203817.65790: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy 
atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 13271 1727203817.65816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 13271 1727203817.65853: stderr chunk (state=3): >>><<< 13271 1727203817.65862: stdout chunk (state=3): >>><<< 13271 1727203817.66307: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28515104d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28514dfb30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851512a50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 
'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28512c1130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28512c2060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28512ffe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28512fff50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851337890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2851337f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851317b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851315280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28512fd040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285135b770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285135a390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851316120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28512fe900> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285138c830> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28512fc2c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f285138cce0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285138cb90> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f285138cf80> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28512fade0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285138d640> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285138d310> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285138e510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28513a4710> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28513a5dc0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28513a6c60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28513a7290> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28513a61b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f28513a7d10> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28513a7440> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285138e480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28510afc80> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28510d86b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28510d8440> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28510d86e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28510d9010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28510d9970> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28510d88c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28510ade20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28510dad20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28510d8e90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285138ec30> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2851107080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285112b410> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28511881d0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285118a930> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28511882f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28511551f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f28511559a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285112a210> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28510dbc50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f285112a330> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_6px_ruct/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850ffaff0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850fd9ee0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850fd9070> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850ff8e90> # 
/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2851032960> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28510326f0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851032000> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851032a50> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850ffba10> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28510336e0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2851033920> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2851033e30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850929be0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f285092b800> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28509301d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28509310d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f2850933dd0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28510d82f0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850932090> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850937c50> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850936720> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850936480> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28509369f0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28509325a0> # extension module 'syslog' loaded from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f285097bef0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285097b920> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f285097da60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285097d820> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f285097ffe0> import 'uuid' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f285097e150> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850983830> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850980200> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28509845c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2850984860> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2850984b60> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285097c1a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2850810320> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2850811580> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850986ab0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2850987e30> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850986690> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28508157c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850816630> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850811790> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850816660> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28508178f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2850822030> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285081f470> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285090a9f0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28509fe6c0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850821eb0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850930140> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28508b6540> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28504e4230> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28504e4590> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28508a3500> import 'multiprocessing.reduction' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f28508b7080> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28508b4c80> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28508b46e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28504e7590> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28504e6e40> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f28504e7020> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28504e6270> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28504e7770> # 
/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f285054a210> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28504e4c50> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28508b4890> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285054a360> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285054af60> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2850582450> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850810200> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f2850595dc0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f28505959a0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # 
/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f285032e8d0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f285032f230> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850327d40> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850374380> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850375310> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850376bd0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f2850376240> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", 
"MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "50", "second": "17", "epoch": "1727203817", "epoch_int": "1727203817", "date": "2024-09-24", "time": "14:50:17", "iso8601_micro": "2024-09-24T18:50:17.255721Z", "iso8601": "2024-09-24T18:50:17Z", "iso8601_basic": "20240924T145017255721", "iso8601_basic_short": "20240924T145017", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, 
"ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2930, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 601, "free": 2930}, "nocache": {"free": 3282, "used": 249}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 408, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": 
[{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785722880, "block_size": 4096, "block_total": 65519099, "block_available": 63912530, "block_used": 1606569, "inode_total": 131070960, "inode_available": 131027285, "inode_used": 43675, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.30810546875, "5m": 0.24267578125, "15m": 0.11962890625}, "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", 
"netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], 
"hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing 
genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # 
cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket 
# cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy 
ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing 
ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing 
ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # 
destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] 
removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata 
# destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # 
cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy 
_bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
[WARNING]: Module invocation had junk after the JSON data:
cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing 
ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping 
_datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy 
_datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed-node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
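The interpreter-discovery WARNING above can be avoided by pinning the Python interpreter explicitly in inventory instead of relying on auto-discovery. A minimal sketch, assuming the host and address shown earlier in this log (the interpreter path `/usr/bin/python3.12` is the one discovery reported, but any stable path works):

```yaml
# inventory.yml -- pin the interpreter so discovery (and its warning) is skipped
all:
  hosts:
    managed-node1:
      ansible_host: 10.31.14.47                      # address seen in the ssh debug output above
      ansible_python_interpreter: /usr/bin/python3.12  # explicit pin instead of auto-discovery
```

Setting `ansible_python_interpreter` per host (or group) makes the meaning of the path independent of future Python installations on the managed node, which is exactly the risk the warning describes.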
13271 1727203817.69728: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203816.4147036-13441-33177150495789/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203817.69983: _low_level_execute_command(): starting 13271 1727203817.69987: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203816.4147036-13441-33177150495789/ > /dev/null 2>&1 && sleep 0' 13271 1727203817.71180: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203817.71294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203817.71393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203817.71610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203817.73597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203817.73659: stderr chunk (state=3): >>><<< 13271 1727203817.73668: stdout chunk (state=3): >>><<< 13271 1727203817.73693: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203817.73708: handler run complete 13271 1727203817.73833: variable 'ansible_facts' from source: unknown 13271 1727203817.73931: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203817.74237: variable 'ansible_facts' from source: unknown 13271 1727203817.74323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203817.74448: attempt loop complete, returning result 13271 1727203817.74458: _execute() done 13271 1727203817.74466: dumping result to json 13271 1727203817.74505: done dumping result, returning 13271 1727203817.74519: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [028d2410-947f-2a40-12ba-0000000000cc] 13271 1727203817.74528: sending task result for task 028d2410-947f-2a40-12ba-0000000000cc 13271 1727203817.75209: done sending task result for task 028d2410-947f-2a40-12ba-0000000000cc 13271 1727203817.75212: WORKER PROCESS EXITING ok: [managed-node1] 13271 1727203817.75801: no more pending results, returning what we have 13271 1727203817.75804: results queue empty 13271 1727203817.75805: checking for any_errors_fatal 13271 1727203817.75806: done checking for any_errors_fatal 13271 1727203817.75807: checking for max_fail_percentage 13271 1727203817.75811: done checking for max_fail_percentage 13271 1727203817.75811: checking to see if all hosts have failed and the running result is not ok 13271 1727203817.75812: done checking to see if all hosts have failed 13271 1727203817.75813: getting the remaining hosts for this loop 13271 1727203817.75814: done getting the remaining hosts for this loop 13271 1727203817.75818: getting the next task for host managed-node1 13271 1727203817.75829: done getting next task for host managed-node1 13271 1727203817.75831: ^ task is: TASK: meta (flush_handlers) 13271 1727203817.75833: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13271 1727203817.75836: getting variables 13271 1727203817.75837: in VariableManager get_vars() 13271 1727203817.75861: Calling all_inventory to load vars for managed-node1 13271 1727203817.75864: Calling groups_inventory to load vars for managed-node1 13271 1727203817.75866: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203817.75877: Calling all_plugins_play to load vars for managed-node1 13271 1727203817.75880: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203817.75883: Calling groups_plugins_play to load vars for managed-node1 13271 1727203817.76165: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203817.76603: done with get_vars() 13271 1727203817.76628: done getting variables 13271 1727203817.76712: in VariableManager get_vars() 13271 1727203817.76721: Calling all_inventory to load vars for managed-node1 13271 1727203817.76723: Calling groups_inventory to load vars for managed-node1 13271 1727203817.76727: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203817.76732: Calling all_plugins_play to load vars for managed-node1 13271 1727203817.76734: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203817.76737: Calling groups_plugins_play to load vars for managed-node1 13271 1727203817.76889: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203817.77096: done with get_vars() 13271 1727203817.77107: done queuing things up, now waiting for results queue to drain 13271 1727203817.77110: results queue empty 13271 1727203817.77110: checking for any_errors_fatal 13271 1727203817.77113: done checking for any_errors_fatal 13271 1727203817.77114: checking for max_fail_percentage 13271 1727203817.77115: done checking for max_fail_percentage 13271 
1727203817.77115: checking to see if all hosts have failed and the running result is not ok 13271 1727203817.77116: done checking to see if all hosts have failed 13271 1727203817.77127: getting the remaining hosts for this loop 13271 1727203817.77129: done getting the remaining hosts for this loop 13271 1727203817.77135: getting the next task for host managed-node1 13271 1727203817.77145: done getting next task for host managed-node1 13271 1727203817.77148: ^ task is: TASK: Include the task 'el_repo_setup.yml' 13271 1727203817.77149: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203817.77151: getting variables 13271 1727203817.77152: in VariableManager get_vars() 13271 1727203817.77159: Calling all_inventory to load vars for managed-node1 13271 1727203817.77164: Calling groups_inventory to load vars for managed-node1 13271 1727203817.77166: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203817.77170: Calling all_plugins_play to load vars for managed-node1 13271 1727203817.77172: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203817.77178: Calling groups_plugins_play to load vars for managed-node1 13271 1727203817.77320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203817.77500: done with get_vars() 13271 1727203817.77507: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:11 Tuesday 24 September 2024 14:50:17 -0400 (0:00:01.408) 0:00:01.418 ***** 13271 1727203817.77582: entering _queue_task() for managed-node1/include_tasks 13271 
1727203817.77584: Creating lock for include_tasks 13271 1727203817.78100: worker is 1 (out of 1 available) 13271 1727203817.78114: exiting _queue_task() for managed-node1/include_tasks 13271 1727203817.78125: done queuing things up, now waiting for results queue to drain 13271 1727203817.78127: waiting for pending results... 13271 1727203817.78592: running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' 13271 1727203817.78597: in run() - task 028d2410-947f-2a40-12ba-000000000006 13271 1727203817.78600: variable 'ansible_search_path' from source: unknown 13271 1727203817.78604: calling self._execute() 13271 1727203817.78607: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203817.78610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203817.78612: variable 'omit' from source: magic vars 13271 1727203817.78694: _execute() done 13271 1727203817.78703: dumping result to json 13271 1727203817.78711: done dumping result, returning 13271 1727203817.78720: done running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' [028d2410-947f-2a40-12ba-000000000006] 13271 1727203817.78729: sending task result for task 028d2410-947f-2a40-12ba-000000000006 13271 1727203817.78865: no more pending results, returning what we have 13271 1727203817.78869: in VariableManager get_vars() 13271 1727203817.79016: Calling all_inventory to load vars for managed-node1 13271 1727203817.79018: Calling groups_inventory to load vars for managed-node1 13271 1727203817.79021: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203817.79030: Calling all_plugins_play to load vars for managed-node1 13271 1727203817.79032: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203817.79035: Calling groups_plugins_play to load vars for managed-node1 13271 1727203817.79238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 13271 1727203817.79427: done with get_vars() 13271 1727203817.79435: variable 'ansible_search_path' from source: unknown 13271 1727203817.79449: we have included files to process 13271 1727203817.79450: generating all_blocks data 13271 1727203817.79452: done generating all_blocks data 13271 1727203817.79453: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 13271 1727203817.79455: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 13271 1727203817.79458: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 13271 1727203817.79471: done sending task result for task 028d2410-947f-2a40-12ba-000000000006 13271 1727203817.79474: WORKER PROCESS EXITING 13271 1727203817.80134: in VariableManager get_vars() 13271 1727203817.80157: done with get_vars() 13271 1727203817.80174: done processing included file 13271 1727203817.80180: iterating over new_blocks loaded from include file 13271 1727203817.80181: in VariableManager get_vars() 13271 1727203817.80191: done with get_vars() 13271 1727203817.80193: filtering new block on tags 13271 1727203817.80205: done filtering new block on tags 13271 1727203817.80208: in VariableManager get_vars() 13271 1727203817.80216: done with get_vars() 13271 1727203817.80218: filtering new block on tags 13271 1727203817.80231: done filtering new block on tags 13271 1727203817.80234: in VariableManager get_vars() 13271 1727203817.80244: done with get_vars() 13271 1727203817.80245: filtering new block on tags 13271 1727203817.80258: done filtering new block on tags 13271 1727203817.80261: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node1 13271 1727203817.80266: 
extending task lists for all hosts with included blocks 13271 1727203817.80317: done extending task lists 13271 1727203817.80318: done processing included files 13271 1727203817.80319: results queue empty 13271 1727203817.80320: checking for any_errors_fatal 13271 1727203817.80321: done checking for any_errors_fatal 13271 1727203817.80322: checking for max_fail_percentage 13271 1727203817.80323: done checking for max_fail_percentage 13271 1727203817.80323: checking to see if all hosts have failed and the running result is not ok 13271 1727203817.80324: done checking to see if all hosts have failed 13271 1727203817.80325: getting the remaining hosts for this loop 13271 1727203817.80326: done getting the remaining hosts for this loop 13271 1727203817.80329: getting the next task for host managed-node1 13271 1727203817.80333: done getting next task for host managed-node1 13271 1727203817.80335: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 13271 1727203817.80337: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203817.80339: getting variables 13271 1727203817.80340: in VariableManager get_vars() 13271 1727203817.80348: Calling all_inventory to load vars for managed-node1 13271 1727203817.80350: Calling groups_inventory to load vars for managed-node1 13271 1727203817.80353: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203817.80358: Calling all_plugins_play to load vars for managed-node1 13271 1727203817.80360: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203817.80363: Calling groups_plugins_play to load vars for managed-node1 13271 1727203817.80534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203817.80987: done with get_vars() 13271 1727203817.80996: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:50:17 -0400 (0:00:00.035) 0:00:01.454 ***** 13271 1727203817.81112: entering _queue_task() for managed-node1/setup 13271 1727203817.81645: worker is 1 (out of 1 available) 13271 1727203817.81656: exiting _queue_task() for managed-node1/setup 13271 1727203817.81665: done queuing things up, now waiting for results queue to drain 13271 1727203817.81667: waiting for pending results... 
13271 1727203817.82115: running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 13271 1727203817.82168: in run() - task 028d2410-947f-2a40-12ba-0000000000dd 13271 1727203817.82188: variable 'ansible_search_path' from source: unknown 13271 1727203817.82194: variable 'ansible_search_path' from source: unknown 13271 1727203817.82240: calling self._execute() 13271 1727203817.82310: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203817.82338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203817.82341: variable 'omit' from source: magic vars 13271 1727203817.82908: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13271 1727203817.85155: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13271 1727203817.85175: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13271 1727203817.85219: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13271 1727203817.85284: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13271 1727203817.85317: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13271 1727203817.85411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203817.85485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203817.85490: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203817.85538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203817.85558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203817.85813: variable 'ansible_facts' from source: unknown 13271 1727203817.85842: variable 'network_test_required_facts' from source: task vars 13271 1727203817.85890: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 13271 1727203817.85901: variable 'omit' from source: magic vars 13271 1727203817.85951: variable 'omit' from source: magic vars 13271 1727203817.85995: variable 'omit' from source: magic vars 13271 1727203817.86032: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203817.86138: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203817.86142: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203817.86145: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203817.86147: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203817.86164: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203817.86172: variable 'ansible_host' from source: host vars for 
'managed-node1' 13271 1727203817.86182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203817.86289: Set connection var ansible_connection to ssh 13271 1727203817.86301: Set connection var ansible_shell_type to sh 13271 1727203817.86315: Set connection var ansible_timeout to 10 13271 1727203817.86325: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203817.86336: Set connection var ansible_pipelining to False 13271 1727203817.86347: Set connection var ansible_shell_executable to /bin/sh 13271 1727203817.86391: variable 'ansible_shell_executable' from source: unknown 13271 1727203817.86399: variable 'ansible_connection' from source: unknown 13271 1727203817.86407: variable 'ansible_module_compression' from source: unknown 13271 1727203817.86414: variable 'ansible_shell_type' from source: unknown 13271 1727203817.86489: variable 'ansible_shell_executable' from source: unknown 13271 1727203817.86492: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203817.86494: variable 'ansible_pipelining' from source: unknown 13271 1727203817.86496: variable 'ansible_timeout' from source: unknown 13271 1727203817.86498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203817.86618: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13271 1727203817.86634: variable 'omit' from source: magic vars 13271 1727203817.86645: starting attempt loop 13271 1727203817.86653: running the handler 13271 1727203817.86675: _low_level_execute_command(): starting 13271 1727203817.86691: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203817.87415: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 
1727203817.87475: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203817.87539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203817.87555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203817.87584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203817.87701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203817.89534: stdout chunk (state=3): >>>/root <<< 13271 1727203817.89698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203817.89702: stdout chunk (state=3): >>><<< 13271 1727203817.89704: stderr chunk (state=3): >>><<< 13271 1727203817.89725: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203817.89834: _low_level_execute_command(): starting 13271 1727203817.89838: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203817.8974032-13516-59838860768406 `" && echo ansible-tmp-1727203817.8974032-13516-59838860768406="` echo /root/.ansible/tmp/ansible-tmp-1727203817.8974032-13516-59838860768406 `" ) && sleep 0' 13271 1727203817.90390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203817.90406: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203817.90421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203817.90494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203817.90540: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203817.90558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203817.90585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203817.90703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203817.92791: stdout chunk (state=3): >>>ansible-tmp-1727203817.8974032-13516-59838860768406=/root/.ansible/tmp/ansible-tmp-1727203817.8974032-13516-59838860768406 <<< 13271 1727203817.92939: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203817.92950: stdout chunk (state=3): >>><<< 13271 1727203817.92969: stderr chunk (state=3): >>><<< 13271 1727203817.92993: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203817.8974032-13516-59838860768406=/root/.ansible/tmp/ansible-tmp-1727203817.8974032-13516-59838860768406 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203817.93044: variable 'ansible_module_compression' from source: unknown 13271 1727203817.93104: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 13271 1727203817.93177: variable 'ansible_facts' from source: unknown 13271 1727203817.93392: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203817.8974032-13516-59838860768406/AnsiballZ_setup.py 13271 1727203817.93625: Sending initial data 13271 1727203817.93628: Sent initial data (153 bytes) 13271 1727203817.94245: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203817.94272: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203817.94290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203817.94378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203817.94413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203817.94436: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203817.94452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203817.94587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203817.96518: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203817.96587: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203817.96682: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpkfv9djqs /root/.ansible/tmp/ansible-tmp-1727203817.8974032-13516-59838860768406/AnsiballZ_setup.py <<< 13271 1727203817.96693: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203817.8974032-13516-59838860768406/AnsiballZ_setup.py" <<< 13271 1727203817.96768: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpkfv9djqs" to remote "/root/.ansible/tmp/ansible-tmp-1727203817.8974032-13516-59838860768406/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203817.8974032-13516-59838860768406/AnsiballZ_setup.py" <<< 13271 1727203817.98513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203817.98564: stderr chunk (state=3): >>><<< 13271 1727203817.98610: stdout chunk (state=3): >>><<< 13271 1727203817.98614: done transferring module to remote 13271 1727203817.98634: _low_level_execute_command(): starting 13271 1727203817.98643: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203817.8974032-13516-59838860768406/ /root/.ansible/tmp/ansible-tmp-1727203817.8974032-13516-59838860768406/AnsiballZ_setup.py && sleep 0' 13271 1727203817.99314: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203817.99326: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203817.99338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203817.99357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203817.99456: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203817.99472: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203817.99499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203817.99610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203818.01619: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203818.01623: stdout chunk (state=3): >>><<< 13271 1727203818.01681: stderr chunk (state=3): >>><<< 13271 1727203818.01685: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203818.01687: _low_level_execute_command(): starting 13271 1727203818.01690: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203817.8974032-13516-59838860768406/AnsiballZ_setup.py && sleep 0' 13271 1727203818.02247: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203818.02258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203818.02264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203818.02441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203818.02444: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203818.02447: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203818.02449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203818.02451: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 1727203818.02453: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 13271 1727203818.02455: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13271 1727203818.02457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203818.02459: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13271 1727203818.02464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203818.02466: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203818.02468: stderr chunk (state=3): >>>debug2: match found <<< 13271 1727203818.02470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203818.02472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203818.02474: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203818.02478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203818.02597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203818.05020: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 13271 1727203818.05044: stdout chunk (state=3): >>>import _imp # builtin <<< 13271 1727203818.05057: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 13271 1727203818.05133: stdout chunk (state=3): >>>import '_io' # <<< 13271 1727203818.05145: stdout chunk (state=3): >>>import 'marshal' # <<< 13271 1727203818.05169: stdout chunk (state=3): >>>import 'posix' # <<< 13271 1727203818.05225: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 13271 1727203818.05247: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 13271 1727203818.05295: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 13271 1727203818.05318: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 13271 
1727203818.05343: stdout chunk (state=3): >>>import '_codecs' # import 'codecs' # <<< 13271 1727203818.05406: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 13271 1727203818.05409: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd704184d0> <<< 13271 1727203818.05452: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd703e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd7041aa50> <<< 13271 1727203818.05486: stdout chunk (state=3): >>>import '_signal' # <<< 13271 1727203818.05514: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 13271 1727203818.05534: stdout chunk (state=3): >>>import 'io' # <<< 13271 1727203818.05573: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 13271 1727203818.05673: stdout chunk (state=3): >>>import '_collections_abc' # <<< 13271 1727203818.05699: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 13271 1727203818.05738: stdout chunk (state=3): >>>import 'os' # <<< 13271 1727203818.05782: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 13271 1727203818.05820: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 13271 1727203818.05826: stdout chunk (state=3): >>>Processing .pth file: 
'/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 13271 1727203818.05854: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 13271 1727203818.05880: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd7022d130> <<< 13271 1727203818.05942: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.05981: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd7022e060> <<< 13271 1727203818.05985: stdout chunk (state=3): >>>import 'site' # <<< 13271 1727203818.06020: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 13271 1727203818.06438: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 13271 1727203818.06442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 13271 1727203818.06472: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.06487: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 13271 1727203818.06524: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 13271 1727203818.06555: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 13271 1727203818.06587: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 13271 1727203818.06608: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd7026bf80> <<< 13271 1727203818.06649: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 13271 1727203818.06654: stdout chunk (state=3): >>>import '_operator' # <<< 13271 1727203818.06687: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70280110> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 13271 1727203818.06708: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 13271 1727203818.06739: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 13271 1727203818.06796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.06839: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702a3950> <<< 13271 1727203818.06873: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 13271 1727203818.06910: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702a3fe0> <<< 13271 1727203818.06914: stdout chunk (state=3): >>>import '_collections' # <<< 13271 1727203818.06958: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70283bf0> import '_functools' # <<< 13271 1727203818.06998: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702812e0> <<< 13271 1727203818.07089: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70269130> <<< 13271 1727203818.07123: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 13271 1727203818.07152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 13271 1727203818.07159: stdout chunk (state=3): >>>import '_sre' # <<< 13271 1727203818.07180: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 13271 1727203818.07211: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 13271 1727203818.07232: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 13271 1727203818.07263: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702c78f0> <<< 13271 1727203818.07287: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702c6510> <<< 13271 1727203818.07316: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70282390> <<< 13271 1727203818.07337: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702c4d40> <<< 13271 1727203818.07388: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 13271 1727203818.07405: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702f4950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702683b0> <<< 13271 1727203818.07425: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 13271 
1727203818.07472: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd702f4e00> <<< 13271 1727203818.07483: stdout chunk (state=3): >>>import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702f4cb0> <<< 13271 1727203818.07514: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.07526: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd702f50a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70266ed0> <<< 13271 1727203818.07562: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.07592: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 13271 1727203818.07632: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 13271 1727203818.07648: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702f5790> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702f5460> <<< 13271 1727203818.07688: stdout chunk (state=3): >>>import 'importlib.machinery' # # 
/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 13271 1727203818.07721: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702f6660> <<< 13271 1727203818.07732: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 13271 1727203818.07764: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 13271 1727203818.07819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 13271 1727203818.07837: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70310890> <<< 13271 1727203818.07854: stdout chunk (state=3): >>>import 'errno' # <<< 13271 1727203818.07896: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd70311fd0> <<< 13271 1727203818.07930: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 13271 1727203818.07962: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 13271 1727203818.07980: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70312e70> <<< 13271 1727203818.08034: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd703134a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd703123c0> <<< 13271 1727203818.08057: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 13271 1727203818.08118: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.08137: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd70313e60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70313590> <<< 13271 1727203818.08202: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702f66c0> <<< 13271 1727203818.08205: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 13271 1727203818.08239: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 13271 1727203818.08274: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 13271 1727203818.08295: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 13271 1727203818.08331: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd7002fd10> <<< 13271 1727203818.08352: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 13271 1727203818.08387: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.08411: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd70058860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd700585c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd70058890> <<< 13271 1727203818.08442: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 13271 1727203818.08518: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.08712: stdout chunk (state=3): 
>>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd700591c0> <<< 13271 1727203818.08918: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd70059bb0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70058a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd7002deb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 13271 1727203818.08929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd7005af90> <<< 13271 1727203818.08957: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70059d00> <<< 13271 1727203818.08974: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702f6db0> <<< 13271 1727203818.09009: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 13271 1727203818.09074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.09096: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 13271 1727203818.09129: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 13271 1727203818.09166: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd700832f0> <<< 13271 1727203818.09249: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 13271 1727203818.09252: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.09289: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 13271 1727203818.09292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 13271 1727203818.09328: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd700a76e0> <<< 13271 1727203818.09352: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 13271 1727203818.09398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 13271 1727203818.09470: stdout chunk (state=3): >>>import 'ntpath' # <<< 13271 1727203818.09488: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd701084d0> <<< 13271 1727203818.09503: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc 
matches /usr/lib64/python3.12/urllib/parse.py <<< 13271 1727203818.09545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 13271 1727203818.09570: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 13271 1727203818.09647: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 13271 1727203818.09706: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd7010ac30> <<< 13271 1727203818.09786: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd701085f0> <<< 13271 1727203818.09826: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd700d14c0> <<< 13271 1727203818.09864: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ff195e0> <<< 13271 1727203818.09891: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd700a64e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd7005bec0> <<< 13271 1727203818.10080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 13271 1727203818.10102: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcd700a6840> <<< 13271 1727203818.10372: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_21vx84x7/ansible_setup_payload.zip' # zipimport: zlib available <<< 13271 
1727203818.10501: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.10545: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 13271 1727203818.10555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 13271 1727203818.10592: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 13271 1727203818.10672: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 13271 1727203818.10720: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ff7f2c0> <<< 13271 1727203818.10723: stdout chunk (state=3): >>>import '_typing' # <<< 13271 1727203818.10915: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ff621b0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ff61340> # zipimport: zlib available <<< 13271 1727203818.10965: stdout chunk (state=3): >>>import 'ansible' # <<< 13271 1727203818.10968: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.11025: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 13271 1727203818.11029: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.12681: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.13761: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from 
'/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ff7d160> <<< 13271 1727203818.13794: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.13831: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 13271 1727203818.13857: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 13271 1727203818.13886: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.13908: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6ffaebd0> <<< 13271 1727203818.13938: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ffae960> <<< 13271 1727203818.13972: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ffae270> <<< 13271 1727203818.13999: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 13271 1727203818.14060: stdout chunk (state=3): >>>import 'json.encoder' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ffaed80> <<< 13271 1727203818.14068: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ff7fce0> import 'atexit' # <<< 13271 1727203818.14089: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6ffaf920> <<< 13271 1727203818.14133: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6ffafb60> <<< 13271 1727203818.14150: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 13271 1727203818.14212: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 13271 1727203818.14215: stdout chunk (state=3): >>>import '_locale' # <<< 13271 1727203818.14284: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ffd8050> import 'pwd' # <<< 13271 1727203818.14298: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 13271 1727203818.14322: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 13271 1727203818.14400: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f92ddf0> <<< 13271 1727203818.14407: stdout chunk (state=3): >>># extension 
module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f92fa10> <<< 13271 1727203818.14432: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 13271 1727203818.14446: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 13271 1727203818.14501: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f934410> <<< 13271 1727203818.14512: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 13271 1727203818.14533: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 13271 1727203818.14567: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f935310> <<< 13271 1727203818.14586: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 13271 1727203818.14630: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 13271 1727203818.14653: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 13271 1727203818.14713: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f937f50> <<< 13271 1727203818.14760: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from 
'/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f938140> <<< 13271 1727203818.14783: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f9362d0> <<< 13271 1727203818.14798: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 13271 1727203818.14838: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 13271 1727203818.14874: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 13271 1727203818.14879: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 13271 1727203818.15020: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 13271 1727203818.15058: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f93bf20> <<< 13271 1727203818.15087: stdout chunk (state=3): >>>import '_tokenize' # <<< 13271 1727203818.15142: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f93a9f0> <<< 13271 1727203818.15162: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f93a750> # 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 13271 1727203818.15184: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 13271 1727203818.15246: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f93acc0> <<< 13271 1727203818.15289: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f9367e0> <<< 13271 1727203818.15311: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f980170> <<< 13271 1727203818.15346: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f980290> <<< 13271 1727203818.15366: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 13271 1727203818.15418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 13271 1727203818.15420: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 13271 1727203818.15460: stdout chunk (state=3): >>># extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f981e50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f981c10> <<< 13271 1727203818.15480: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 13271 1727203818.15520: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 13271 1727203818.15578: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.15581: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f9843b0> <<< 13271 1727203818.15619: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f982540> <<< 13271 1727203818.15621: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 13271 1727203818.15651: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.15688: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 13271 1727203818.15691: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 13271 1727203818.15711: stdout chunk (state=3): >>>import '_string' # <<< 13271 1727203818.15746: stdout chunk (state=3): >>>import 'string' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f987b90> <<< 13271 1727203818.15882: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f984560> <<< 13271 1727203818.15952: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f988c50> <<< 13271 1727203818.15983: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f988bf0> <<< 13271 1727203818.16038: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f988ec0> <<< 13271 1727203818.16072: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f980590> <<< 13271 1727203818.16090: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 13271 1727203818.16115: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 13271 1727203818.16128: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 13271 1727203818.16167: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.16195: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f814650> <<< 13271 1727203818.16350: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.16365: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f8158e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f98ade0> <<< 13271 1727203818.16419: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f98b9b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f98a9c0> <<< 13271 1727203818.16461: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 13271 1727203818.16464: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.16561: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 13271 1727203818.16735: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.16755: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 13271 1727203818.16929: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.16965: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.17563: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.18172: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 13271 1727203818.18201: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 13271 1727203818.18233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.18299: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so'<<< 13271 1727203818.18302: stdout chunk (state=3): >>> import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f819af0> <<< 13271 1727203818.18388: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 13271 1727203818.18402: stdout chunk (state=3): >>>import 'ctypes._endian' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f81a870> <<< 13271 1727203818.18425: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f815c10> <<< 13271 1727203818.18484: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 13271 1727203818.18487: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.18518: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.18522: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 13271 1727203818.18531: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.18697: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.18867: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 13271 1727203818.18900: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f81af90> # zipimport: zlib available <<< 13271 1727203818.19405: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.19902: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.19979: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.20057: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 13271 1727203818.20071: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.20109: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.20150: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 13271 1727203818.20229: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.20323: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 13271 1727203818.20329: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.20372: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 13271 1727203818.20407: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.20502: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 13271 1727203818.20748: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.20952: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 13271 1727203818.21041: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 13271 1727203818.21044: stdout chunk (state=3): >>>import '_ast' # <<< 13271 1727203818.21134: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f81bb00> <<< 13271 1727203818.21137: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.21212: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.21379: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 13271 1727203818.21383: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available <<< 13271 1727203818.21403: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.21432: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 13271 1727203818.21483: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.21523: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.21590: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.21664: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 13271 1727203818.21715: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.21808: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f8265a0> <<< 13271 1727203818.21846: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f822f60> <<< 13271 1727203818.21879: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 13271 1727203818.21904: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.21959: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.22024: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.22055: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.22100: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.22128: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 13271 1727203818.22172: stdout chunk (state=3): >>># code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 13271 1727203818.22177: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 13271 1727203818.22258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 13271 1727203818.22276: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 13271 1727203818.22292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 13271 1727203818.22347: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f90ed50> <<< 13271 1727203818.22390: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ffdea20> <<< 13271 1727203818.22483: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f826360> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f81cb90> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 13271 1727203818.22502: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.22532: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.22556: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 13271 1727203818.22607: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 13271 1727203818.22649: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13271 1727203818.22669: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available <<< 13271 1727203818.22729: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 
1727203818.22805: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.22820: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.22837: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.22878: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.22921: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.22963: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.23012: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 13271 1727203818.23015: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.23094: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.23172: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.23190: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.23240: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 13271 1727203818.23258: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.23427: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.23604: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.23641: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.23705: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.23741: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 13271 1727203818.23761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 13271 
1727203818.23779: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 13271 1727203818.23804: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 13271 1727203818.23837: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f8b6b70> <<< 13271 1727203818.23866: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 13271 1727203818.23892: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 13271 1727203818.23907: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 13271 1727203818.23945: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 13271 1727203818.23982: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 13271 1727203818.23991: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 13271 1727203818.24026: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f468500> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.24049: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f468860> <<< 13271 1727203818.24109: 
stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f8a0a70> <<< 13271 1727203818.24136: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f8b7680> <<< 13271 1727203818.24164: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f8b5250> <<< 13271 1727203818.24180: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f8b4e00> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 13271 1727203818.24266: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 13271 1727203818.24282: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 13271 1727203818.24312: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 13271 1727203818.24346: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f46b7d0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f46b080> <<< 13271 1727203818.24375: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed 
from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f46b260> <<< 13271 1727203818.24403: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f46a4b0> <<< 13271 1727203818.24425: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 13271 1727203818.24611: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 13271 1727203818.24614: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f46b950> <<< 13271 1727203818.24616: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 13271 1727203818.24645: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 13271 1727203818.24680: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f4ca450> <<< 13271 1727203818.24713: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f4c8470> <<< 13271 1727203818.24745: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f8b4f50> import 'ansible.module_utils.facts.timeout' # <<< 13271 1727203818.24807: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 13271 1727203818.24900: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13271 1727203818.24997: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 13271 1727203818.25011: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.25062: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available <<< 13271 1727203818.25083: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 13271 1727203818.25132: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13271 1727203818.25156: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 13271 1727203818.25192: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.25225: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.25281: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 13271 1727203818.25332: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.25380: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 13271 1727203818.25443: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.25505: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.25565: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.25627: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 13271 1727203818.25645: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.26148: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.26615: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 13271 1727203818.26626: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.26687: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.26736: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.26777: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.26823: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 13271 1727203818.26847: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.26863: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.26897: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 13271 1727203818.26907: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.26956: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.27025: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 13271 1727203818.27045: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.27069: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.27099: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 13271 1727203818.27122: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.27158: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.27180: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 13271 1727203818.27194: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.27266: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.27372: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 13271 1727203818.27377: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 13271 1727203818.27409: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f4ca750> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 13271 1727203818.27439: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 13271 1727203818.27580: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f4cb320> import 'ansible.module_utils.facts.system.local' # <<< 13271 1727203818.27596: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.27649: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.27723: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 13271 1727203818.27732: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.27819: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.27909: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 13271 1727203818.27927: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.27987: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.28070: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 13271 1727203818.28118: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.28167: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 13271 1727203818.28231: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 13271 1727203818.28308: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.28372: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f506720> <<< 13271 1727203818.28581: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f4f64e0> import 'ansible.module_utils.facts.system.python' # <<< 13271 1727203818.28603: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.28644: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.28719: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 13271 1727203818.28805: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.28887: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.29084: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.29302: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available <<< 13271 1727203818.29341: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 13271 1727203818.29365: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 13271 1727203818.29387: stdout chunk (state=3): >>># extension module 'termios' loaded from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.29418: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f51e1b0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f51ddf0> <<< 13271 1727203818.29450: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 13271 1727203818.29467: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.29515: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.29542: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 13271 1727203818.29562: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.29732: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.29879: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 13271 1727203818.29897: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.30007: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.30111: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.30143: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.30200: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 13271 1727203818.30231: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13271 1727203818.30245: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.30402: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.30557: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 13271 1727203818.30561: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.30681: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.30813: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 13271 1727203818.30818: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.30842: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.30888: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.31472: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.32033: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 13271 1727203818.32036: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.32150: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.32265: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 13271 1727203818.32285: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.32372: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.32481: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 13271 1727203818.32501: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.32648: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.32817: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 13271 1727203818.32822: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.32857: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 13271 1727203818.32863: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 13271 1727203818.32920: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.32950: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 13271 1727203818.32964: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.33272: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 13271 1727203818.33384: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.33601: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 13271 1727203818.33613: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.33637: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.33688: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 13271 1727203818.33691: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.33719: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.33748: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 13271 1727203818.33751: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.33930: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available <<< 13271 1727203818.33934: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.33965: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 13271 1727203818.34027: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.34082: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 13271 1727203818.34099: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.34153: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 13271 1727203818.34215: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 13271 1727203818.34229: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.34513: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.34801: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 13271 1727203818.34804: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.34859: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.34937: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 13271 1727203818.34941: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.34970: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.35007: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 13271 1727203818.35022: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.35053: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.35100: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 13271 1727203818.35103: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.35131: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.35170: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 13271 1727203818.35189: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.35258: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.35359: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 13271 1727203818.35386: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.35389: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # <<< 13271 
1727203818.35420: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.35437: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.35488: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 13271 1727203818.35537: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.35541: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.35551: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.35584: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.35637: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.35708: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.35795: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 13271 1727203818.35805: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 13271 1727203818.35851: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.35909: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 13271 1727203818.35912: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.36126: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.36332: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 13271 1727203818.36354: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.36399: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.36453: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 13271 1727203818.36456: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.36511: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.36553: stdout 
chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 13271 1727203818.36570: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.36644: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.36750: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 13271 1727203818.36770: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.36842: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.36931: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 13271 1727203818.37022: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.38043: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 13271 1727203818.38082: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 13271 1727203818.38112: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 13271 1727203818.38145: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f317a40> <<< 13271 1727203818.38152: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f314170> <<< 13271 1727203818.38216: stdout chunk 
(state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f316c00> <<< 13271 1727203818.38595: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "50", "second": "18", "epoch": "1727203818", "epoch_int": "1727203818", "date": "2024-09-24", "time": "14:50:18", "iso8601_micro": "2024-09-24T18:50:18.373762Z", "iso8601": "2024-09-24T18:50:18Z", "iso8601_basic": "20240924T145018373762", "iso8601_basic_short": 
"20240924T145018", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", 
"ttyS0,115200n8"]}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version"<<< 13271 1727203818.38600: stdout chunk (state=3): >>>: "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13271 1727203818.39270: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 <<< 13271 1727203818.39281: stdout chunk (state=3): >>># clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp <<< 13271 1727203818.39304: stdout chunk (state=3): >>># cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # 
cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 <<< 13271 1727203818.39345: stdout chunk (state=3): >>># cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # 
cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect <<< 13271 1727203818.39383: stdout chunk (state=3): >>># cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder <<< 13271 1727203818.39390: stdout chunk (state=3): >>># cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing 
_posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap <<< 13271 1727203818.39447: stdout chunk (state=3): >>># cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes <<< 13271 1727203818.39450: stdout chunk (state=3): >>># destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] 
removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing <<< 13271 1727203818.39456: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info <<< 13271 1727203818.39467: stdout chunk (state=3): >>># destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # 
destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ <<< 13271 1727203818.39500: stdout chunk (state=3): >>># destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # 
cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass <<< 13271 1727203818.39568: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing 
ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy 
ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr <<< 13271 1727203818.39571: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base <<< 13271 1727203818.39597: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # 
destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 13271 1727203818.40099: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 13271 1727203818.40102: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 13271 1727203818.40150: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 13271 1727203818.40153: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma <<< 13271 1727203818.40158: stdout chunk (state=3): >>># destroy zipfile._path <<< 13271 1727203818.40189: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 13271 1727203818.40219: stdout chunk (state=3): >>># destroy ntpath <<< 13271 1727203818.40242: stdout chunk (state=3): >>># destroy importlib # 
destroy zipimport <<< 13271 1727203818.40287: stdout chunk (state=3): >>># destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 13271 1727203818.40316: stdout chunk (state=3): >>># destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 13271 1727203818.40364: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 13271 1727203818.40383: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 13271 1727203818.40449: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal <<< 13271 1727203818.40458: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 13271 1727203818.40511: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process <<< 13271 1727203818.40517: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors <<< 13271 1727203818.40563: stdout chunk (state=3): >>># destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime <<< 13271 1727203818.40566: stdout chunk (state=3): >>># destroy subprocess # destroy base64 <<< 13271 1727203818.40603: stdout chunk (state=3): >>># destroy _ssl <<< 13271 1727203818.40620: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd <<< 13271 1727203818.40658: stdout chunk (state=3): >>># destroy termios # destroy errno # destroy json <<< 13271 
1727203818.40692: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 13271 1727203818.40702: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 13271 1727203818.40761: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser <<< 13271 1727203818.40799: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 13271 1727203818.40849: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math <<< 13271 1727203818.40905: stdout chunk (state=3): >>># cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 13271 1727203818.40911: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] 
wiping functools <<< 13271 1727203818.40967: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat <<< 13271 1727203818.40974: stdout chunk (state=3): >>># destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 13271 1727203818.40978: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 13271 1727203818.41022: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon<<< 13271 1727203818.41025: stdout chunk (state=3): >>> # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 13271 1727203818.41250: stdout chunk (state=3): >>># destroy sys.monitoring <<< 13271 1727203818.41254: stdout chunk (state=3): >>># destroy _socket <<< 13271 1727203818.41266: stdout chunk (state=3): >>># destroy _collections <<< 13271 1727203818.41299: stdout chunk (state=3): >>># destroy platform # destroy _uuid <<< 13271 1727203818.41302: stdout chunk (state=3): >>># destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 13271 1727203818.41341: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy 
copyreg # destroy contextlib <<< 13271 1727203818.41401: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize <<< 13271 1727203818.41404: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 13271 1727203818.41434: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 13271 1727203818.41564: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 13271 1727203818.41622: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 13271 1727203818.41625: stdout chunk (state=3): >>># destroy _hashlib <<< 13271 1727203818.41643: stdout chunk (state=3): >>># destroy _operator # destroy _sre # destroy _string # destroy re <<< 13271 1727203818.41690: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 13271 1727203818.41700: stdout chunk (state=3): >>># clear sys.audit hooks <<< 13271 1727203818.42117: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203818.42180: stdout chunk (state=3): >>><<< 13271 1727203818.42183: stderr chunk (state=3): >>><<< 13271 1727203818.42491: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd704184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd703e7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd7041aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd7022d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd7022e060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd7026bf80> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70280110> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702a3950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702a3fe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70283bf0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702812e0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70269130> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702c78f0> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd702c6510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70282390> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702c4d40> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702f4950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702683b0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd702f4e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702f4cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd702f50a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70266ed0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702f5790> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702f5460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702f6660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70310890> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd70311fd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd70312e70> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd703134a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd703123c0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd70313e60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70313590> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702f66c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd7002fd10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd70058860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd700585c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd70058890> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd700591c0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd70059bb0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70058a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd7002deb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd7005af90> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd70059d00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd702f6db0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd700832f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd700a76e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd701084d0> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd7010ac30> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd701085f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd700d14c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ff195e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd700a64e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd7005bec0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcd700a6840> # zipimport: found 103 names in '/tmp/ansible_setup_payload_21vx84x7/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ff7f2c0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ff621b0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ff61340> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ff7d160> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6ffaebd0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ffae960> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ffae270> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ffaed80> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ff7fce0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6ffaf920> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6ffafb60> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ffd8050> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f92ddf0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f92fa10> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f934410> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f935310> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f937f50> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f938140> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f9362d0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f93bf20> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f93a9f0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f93a750> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f93acc0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f9367e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f980170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f980290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f981e50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f981c10> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f9843b0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f982540> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f987b90> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f984560> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f988c50> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f988bf0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f988ec0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f980590> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f814650> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f8158e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f98ade0> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f98b9b0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f98a9c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f819af0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f81a870> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f815c10> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f81af90> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f81bb00> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f8265a0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f822f60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f90ed50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6ffdea20> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f826360> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f81cb90> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f8b6b70> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from 
'/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f468500> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f468860> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f8a0a70> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f8b7680> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f8b5250> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f8b4e00> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f46b7d0> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f46b080> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f46b260> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f46a4b0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f46b950> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f4ca450> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f4c8470> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f8b4f50> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f4ca750> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f4cb320> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f506720> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f4f64e0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from 
'/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f51e1b0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f51ddf0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd6f317a40> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f314170> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd6f316c00> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "50", "second": "18", "epoch": "1727203818", "epoch_int": "1727203818", "date": "2024-09-24", "time": "14:50:18", "iso8601_micro": "2024-09-24T18:50:18.373762Z", "iso8601": "2024-09-24T18:50:18Z", "iso8601_basic": "20240924T145018373762", "iso8601_basic_short": "20240924T145018", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", 
"GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_system": "Linux", "ansible_kernel": 
"6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] 
removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy 
random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # 
cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing 
ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # 
cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing 
ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # 
cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy 
ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy 
ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # 
cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping 
encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear 
sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
[WARNING]: Module invocation had junk after the JSON data:
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 13271 1727203818.43748: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203817.8974032-13516-59838860768406/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203818.43751: _low_level_execute_command(): starting 13271 1727203818.43754: _low_level_execute_command(): 
executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203817.8974032-13516-59838860768406/ > /dev/null 2>&1 && sleep 0' 13271 1727203818.43863: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203818.43879: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203818.43895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203818.43915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203818.43933: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203818.43945: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203818.43965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203818.44055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203818.44087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203818.44195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203818.46221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203818.46231: stdout chunk (state=3): >>><<< 13271 1727203818.46241: stderr chunk 
(state=3): >>><<< 13271 1727203818.46269: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203818.46282: handler run complete 13271 1727203818.46328: variable 'ansible_facts' from source: unknown 13271 1727203818.46472: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203818.46513: variable 'ansible_facts' from source: unknown 13271 1727203818.46579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203818.46641: attempt loop complete, returning result 13271 1727203818.46650: _execute() done 13271 1727203818.46659: dumping result to json 13271 1727203818.46696: done dumping result, returning 13271 1727203818.46699: done running TaskExecutor() for managed-node1/TASK: 
Gather the minimum subset of ansible_facts required by the network role test [028d2410-947f-2a40-12ba-0000000000dd] 13271 1727203818.46702: sending task result for task 028d2410-947f-2a40-12ba-0000000000dd 13271 1727203818.47181: done sending task result for task 028d2410-947f-2a40-12ba-0000000000dd 13271 1727203818.47185: WORKER PROCESS EXITING ok: [managed-node1] 13271 1727203818.47286: no more pending results, returning what we have 13271 1727203818.47290: results queue empty 13271 1727203818.47291: checking for any_errors_fatal 13271 1727203818.47292: done checking for any_errors_fatal 13271 1727203818.47293: checking for max_fail_percentage 13271 1727203818.47299: done checking for max_fail_percentage 13271 1727203818.47300: checking to see if all hosts have failed and the running result is not ok 13271 1727203818.47301: done checking to see if all hosts have failed 13271 1727203818.47301: getting the remaining hosts for this loop 13271 1727203818.47302: done getting the remaining hosts for this loop 13271 1727203818.47306: getting the next task for host managed-node1 13271 1727203818.47314: done getting next task for host managed-node1 13271 1727203818.47317: ^ task is: TASK: Check if system is ostree 13271 1727203818.47319: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203818.47323: getting variables 13271 1727203818.47324: in VariableManager get_vars() 13271 1727203818.47351: Calling all_inventory to load vars for managed-node1 13271 1727203818.47354: Calling groups_inventory to load vars for managed-node1 13271 1727203818.47357: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203818.47367: Calling all_plugins_play to load vars for managed-node1 13271 1727203818.47369: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203818.47373: Calling groups_plugins_play to load vars for managed-node1 13271 1727203818.47580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203818.47773: done with get_vars() 13271 1727203818.47787: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:50:18 -0400 (0:00:00.667) 0:00:02.122 ***** 13271 1727203818.47885: entering _queue_task() for managed-node1/stat 13271 1727203818.48139: worker is 1 (out of 1 available) 13271 1727203818.48151: exiting _queue_task() for managed-node1/stat 13271 1727203818.48167: done queuing things up, now waiting for results queue to drain 13271 1727203818.48169: waiting for pending results... 
13271 1727203818.48395: running TaskExecutor() for managed-node1/TASK: Check if system is ostree 13271 1727203818.48479: in run() - task 028d2410-947f-2a40-12ba-0000000000df 13271 1727203818.48497: variable 'ansible_search_path' from source: unknown 13271 1727203818.48500: variable 'ansible_search_path' from source: unknown 13271 1727203818.48533: calling self._execute() 13271 1727203818.48609: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203818.48618: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203818.48625: variable 'omit' from source: magic vars 13271 1727203818.49107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13271 1727203818.49342: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13271 1727203818.49395: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13271 1727203818.49449: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13271 1727203818.49494: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13271 1727203818.49580: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13271 1727203818.49608: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13271 1727203818.49680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203818.49683: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13271 1727203818.49786: Evaluated conditional (not __network_is_ostree is defined): True 13271 1727203818.49796: variable 'omit' from source: magic vars 13271 1727203818.49830: variable 'omit' from source: magic vars 13271 1727203818.49869: variable 'omit' from source: magic vars 13271 1727203818.49905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203818.49937: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203818.49964: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203818.50001: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203818.50181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203818.50184: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203818.50187: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203818.50188: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203818.50191: Set connection var ansible_connection to ssh 13271 1727203818.50193: Set connection var ansible_shell_type to sh 13271 1727203818.50194: Set connection var ansible_timeout to 10 13271 1727203818.50196: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203818.50198: Set connection var ansible_pipelining to False 13271 1727203818.50200: Set connection var ansible_shell_executable to /bin/sh 13271 1727203818.50219: variable 'ansible_shell_executable' from source: unknown 13271 1727203818.50227: variable 'ansible_connection' from 
source: unknown 13271 1727203818.50238: variable 'ansible_module_compression' from source: unknown 13271 1727203818.50245: variable 'ansible_shell_type' from source: unknown 13271 1727203818.50251: variable 'ansible_shell_executable' from source: unknown 13271 1727203818.50263: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203818.50278: variable 'ansible_pipelining' from source: unknown 13271 1727203818.50287: variable 'ansible_timeout' from source: unknown 13271 1727203818.50295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203818.50449: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13271 1727203818.50468: variable 'omit' from source: magic vars 13271 1727203818.50481: starting attempt loop 13271 1727203818.50488: running the handler 13271 1727203818.50506: _low_level_execute_command(): starting 13271 1727203818.50519: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203818.51394: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203818.51522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203818.51535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203818.51640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203818.53415: stdout chunk (state=3): >>>/root <<< 13271 1727203818.53578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203818.53582: stdout chunk (state=3): >>><<< 13271 1727203818.53584: stderr chunk (state=3): >>><<< 13271 1727203818.53689: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 13271 1727203818.53701: _low_level_execute_command(): starting 13271 1727203818.53704: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203818.5360866-13540-242152280958336 `" && echo ansible-tmp-1727203818.5360866-13540-242152280958336="` echo /root/.ansible/tmp/ansible-tmp-1727203818.5360866-13540-242152280958336 `" ) && sleep 0' 13271 1727203818.54281: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203818.54302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203818.54363: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203818.54382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203818.54433: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203818.54522: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203818.56637: stdout chunk 
(state=3): >>>ansible-tmp-1727203818.5360866-13540-242152280958336=/root/.ansible/tmp/ansible-tmp-1727203818.5360866-13540-242152280958336 <<< 13271 1727203818.56804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203818.56808: stdout chunk (state=3): >>><<< 13271 1727203818.56811: stderr chunk (state=3): >>><<< 13271 1727203818.56829: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203818.5360866-13540-242152280958336=/root/.ansible/tmp/ansible-tmp-1727203818.5360866-13540-242152280958336 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203818.56917: variable 'ansible_module_compression' from source: unknown 13271 1727203818.56966: ANSIBALLZ: Using lock for stat 13271 1727203818.56974: ANSIBALLZ: Acquiring lock 13271 1727203818.56984: ANSIBALLZ: Lock acquired: 140497830697136 13271 
1727203818.56992: ANSIBALLZ: Creating module 13271 1727203818.70283: ANSIBALLZ: Writing module into payload 13271 1727203818.70425: ANSIBALLZ: Writing module 13271 1727203818.70437: ANSIBALLZ: Renaming module 13271 1727203818.70480: ANSIBALLZ: Done creating module 13271 1727203818.70484: variable 'ansible_facts' from source: unknown 13271 1727203818.70559: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203818.5360866-13540-242152280958336/AnsiballZ_stat.py 13271 1727203818.70995: Sending initial data 13271 1727203818.70998: Sent initial data (153 bytes) 13271 1727203818.72187: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203818.72345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203818.72358: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203818.72459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203818.74229: stderr chunk (state=3): >>>debug2: Remote version: 3 
debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 13271 1727203818.74242: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203818.74316: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13271 1727203818.74412: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpq6tq8hf9 /root/.ansible/tmp/ansible-tmp-1727203818.5360866-13540-242152280958336/AnsiballZ_stat.py <<< 13271 1727203818.74415: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203818.5360866-13540-242152280958336/AnsiballZ_stat.py" <<< 13271 1727203818.74494: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpq6tq8hf9" to remote "/root/.ansible/tmp/ansible-tmp-1727203818.5360866-13540-242152280958336/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203818.5360866-13540-242152280958336/AnsiballZ_stat.py" <<< 13271 1727203818.75513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203818.75517: stderr chunk (state=3): >>><<< 13271 1727203818.75520: stdout chunk (state=3): >>><<< 13271 1727203818.75523: done transferring module to remote 13271 1727203818.75525: 
_low_level_execute_command(): starting 13271 1727203818.75527: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203818.5360866-13540-242152280958336/ /root/.ansible/tmp/ansible-tmp-1727203818.5360866-13540-242152280958336/AnsiballZ_stat.py && sleep 0' 13271 1727203818.76070: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203818.76088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203818.76106: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203818.76123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203818.76141: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203818.76154: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203818.76250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203818.76274: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203818.76295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203818.76405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203818.78430: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203818.78471: stderr chunk (state=3): >>><<< 13271 1727203818.78492: stdout chunk (state=3): >>><<< 13271 1727203818.78514: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203818.78523: _low_level_execute_command(): starting 13271 1727203818.78532: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203818.5360866-13540-242152280958336/AnsiballZ_stat.py && sleep 0' 13271 1727203818.79174: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203818.79194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203818.79211: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13271 1727203818.79229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203818.79247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203818.79259: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203818.79285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203818.79306: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 1727203818.79390: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203818.79409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203818.79430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203818.79623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203818.81984: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 13271 1727203818.82011: stdout chunk (state=3): >>>import _imp # builtin <<< 13271 1727203818.82044: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 13271 1727203818.82104: stdout chunk (state=3): >>>import '_io' # <<< 13271 1727203818.82122: stdout chunk (state=3): >>>import 'marshal' # <<< 13271 1727203818.82146: stdout chunk (state=3): >>>import 'posix' # <<< 13271 
1727203818.82186: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 13271 1727203818.82208: stdout chunk (state=3): >>> # installing zipimport hook <<< 13271 1727203818.82222: stdout chunk (state=3): >>>import 'time' # <<< 13271 1727203818.82247: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 13271 1727203818.82279: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 13271 1727203818.82299: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.82313: stdout chunk (state=3): >>>import '_codecs' # <<< 13271 1727203818.82334: stdout chunk (state=3): >>>import 'codecs' # <<< 13271 1727203818.82371: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 13271 1727203818.82402: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 13271 1727203818.82428: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcc184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcbe7b30> <<< 13271 1727203818.82464: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 13271 1727203818.82492: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcc1aa50> import '_signal' # <<< 13271 1727203818.82520: stdout chunk (state=3): >>>import '_abc' # <<< 13271 1727203818.82537: stdout chunk (state=3): >>>import 'abc' # <<< 13271 1727203818.82567: stdout chunk 
(state=3): >>>import 'io' # <<< 13271 1727203818.82593: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 13271 1727203818.82688: stdout chunk (state=3): >>>import '_collections_abc' # <<< 13271 1727203818.82701: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 13271 1727203818.82741: stdout chunk (state=3): >>>import 'os' # <<< 13271 1727203818.82761: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 13271 1727203818.82785: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 13271 1727203818.82818: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 13271 1727203818.82852: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 13271 1727203818.82867: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca2d130> <<< 13271 1727203818.82939: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.82978: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca2e060> <<< 13271 1727203818.82982: stdout chunk (state=3): >>>import 'site' # <<< 13271 1727203818.83022: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type 
"help", "copyright", "credits" or "license" for more information. <<< 13271 1727203818.83262: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 13271 1727203818.83280: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 13271 1727203818.83306: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 13271 1727203818.83327: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.83345: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 13271 1727203818.83494: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 13271 1727203818.83496: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca6bf80> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # <<< 13271 1727203818.83717: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca80110> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 13271 1727203818.83721: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaa3950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaa3fe0> <<< 13271 1727203818.83742: stdout chunk (state=3): >>>import '_collections' # <<< 13271 1727203818.83784: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca83bf0> <<< 13271 1727203818.83808: stdout chunk (state=3): >>>import '_functools' # <<< 13271 1727203818.83892: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca812e0> <<< 13271 1727203818.83930: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca69130> <<< 13271 1727203818.83954: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 13271 1727203818.83982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 13271 1727203818.84000: stdout chunk (state=3): >>>import '_sre' # <<< 13271 1727203818.84020: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 13271 1727203818.84049: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 13271 1727203818.84078: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 13271 1727203818.84112: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcac78f0> <<< 13271 1727203818.84132: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcac6510> <<< 13271 1727203818.84158: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca82390> <<< 13271 1727203818.84173: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcac4d40> <<< 13271 1727203818.84217: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 13271 1727203818.84269: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaf4950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca683b0> <<< 13271 1727203818.84297: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 13271 1727203818.84312: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7effdcaf4e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaf4cb0> <<< 13271 1727203818.84357: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.84370: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdcaf50a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca66ed0> <<< 13271 1727203818.84398: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.84419: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 13271 1727203818.84479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 13271 1727203818.84490: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaf5790> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaf5460> import 'importlib.machinery' # <<< 13271 1727203818.84528: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 13271 1727203818.84552: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaf6660> import 'importlib.util' # <<< 13271 1727203818.84596: stdout 
chunk (state=3): >>>import 'runpy' # <<< 13271 1727203818.84599: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 13271 1727203818.84626: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 13271 1727203818.84664: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcb10890> <<< 13271 1727203818.84716: stdout chunk (state=3): >>>import 'errno' # <<< 13271 1727203818.84745: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdcb11fd0> <<< 13271 1727203818.84748: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 13271 1727203818.84800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 13271 1727203818.84803: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcb12e70> <<< 13271 1727203818.84849: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.84884: stdout chunk (state=3): >>># extension module '_bz2' executed from 
'/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdcb134a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcb123c0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 13271 1727203818.84898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 13271 1727203818.84948: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdcb13e60> <<< 13271 1727203818.84995: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcb13590> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaf66c0> <<< 13271 1727203818.85022: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 13271 1727203818.85059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 13271 1727203818.85109: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 13271 1727203818.85132: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc8afd10> <<< 13271 
1727203818.85172: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 13271 1727203818.85195: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc8d8860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc8d85c0> <<< 13271 1727203818.85220: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc8d8890> <<< 13271 1727203818.85249: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 13271 1727203818.85332: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.85472: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.85484: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc8d91c0> <<< 13271 1727203818.85626: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension 
module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc8d9bb0> <<< 13271 1727203818.85652: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc8d8a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc8adeb0> <<< 13271 1727203818.85686: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 13271 1727203818.85710: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 13271 1727203818.85742: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 13271 1727203818.85753: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc8daf90> <<< 13271 1727203818.85788: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc8d9d00> <<< 13271 1727203818.85805: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaf6db0> <<< 13271 1727203818.85829: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 13271 1727203818.85896: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.85920: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 13271 1727203818.85954: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 13271 1727203818.85985: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc9032f0> <<< 13271 1727203818.86041: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 13271 1727203818.86083: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.86086: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 13271 1727203818.86110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 13271 1727203818.86140: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc9276e0> <<< 13271 1727203818.86167: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 13271 1727203818.86207: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 13271 1727203818.86274: stdout chunk (state=3): >>>import 'ntpath' # <<< 13271 1727203818.86314: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc9884d0> <<< 13271 1727203818.86317: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 13271 1727203818.86351: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 13271 1727203818.86379: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 13271 1727203818.86422: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 13271 1727203818.86513: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc98ac30> <<< 13271 1727203818.86599: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc9885f0> <<< 13271 1727203818.86632: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc9514c0> <<< 13271 1727203818.86679: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc78d5b0> <<< 13271 1727203818.86704: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc9264e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc8dbec0> <<< 13271 1727203818.86811: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 13271 1727203818.86835: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7effdc926840> <<< 13271 1727203818.87013: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_ufr0yleu/ansible_stat_payload.zip' # zipimport: zlib available <<< 13271 1727203818.87157: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.87201: stdout chunk (state=3): 
>>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 13271 1727203818.87204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 13271 1727203818.87257: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 13271 1727203818.87337: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 13271 1727203818.87389: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc7df290> import '_typing' # <<< 13271 1727203818.87598: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc7c2180> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc7c1310> <<< 13271 1727203818.87606: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.87664: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 13271 1727203818.87668: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.87694: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.87707: stdout chunk (state=3): >>>import 'ansible.module_utils' # # zipimport: zlib available <<< 13271 1727203818.89212: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.90479: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7effdc7dcfe0> <<< 13271 1727203818.90502: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.90520: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 13271 1727203818.90544: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 13271 1727203818.90552: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 13271 1727203818.90588: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.90597: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc80ab70> <<< 13271 1727203818.90628: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc80a900> <<< 13271 1727203818.90667: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc80a210> <<< 13271 1727203818.90694: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 13271 1727203818.90697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 13271 1727203818.90743: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object 
at 0x7effdc80ad20> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc7dff20> import 'atexit' # <<< 13271 1727203818.90777: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc80b8c0> <<< 13271 1727203818.90808: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc80ba40> <<< 13271 1727203818.90845: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 13271 1727203818.90905: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 13271 1727203818.90919: stdout chunk (state=3): >>>import '_locale' # <<< 13271 1727203818.90980: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc80bf80> <<< 13271 1727203818.90984: stdout chunk (state=3): >>>import 'pwd' # <<< 13271 1727203818.90997: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 13271 1727203818.91021: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 13271 1727203818.91068: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc115ca0> <<< 13271 1727203818.91101: stdout chunk (state=3): >>># extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc1178c0> <<< 13271 1727203818.91118: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 13271 1727203818.91142: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 13271 1727203818.91185: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc1182c0> <<< 13271 1727203818.91207: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 13271 1727203818.91239: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 13271 1727203818.91250: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc119160> <<< 13271 1727203818.91278: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 13271 1727203818.91329: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 13271 1727203818.91340: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 13271 1727203818.91405: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc11be60> <<< 13271 1727203818.91456: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from 
'/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdcb12de0> <<< 13271 1727203818.91469: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc11a180> <<< 13271 1727203818.91483: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 13271 1727203818.91523: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 13271 1727203818.91552: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 13271 1727203818.91564: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 13271 1727203818.91601: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 13271 1727203818.91638: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc127d10> <<< 13271 1727203818.91659: stdout chunk (state=3): >>>import '_tokenize' # <<< 13271 1727203818.91733: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc1267e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc126540> <<< 13271 1727203818.91763: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 13271 1727203818.91859: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc126ab0> <<< 13271 1727203818.91873: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc11a660> <<< 13271 1727203818.91904: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc16bf50> <<< 13271 1727203818.92029: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc16b980> <<< 13271 1727203818.92278: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc16db20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc16d8e0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc16ffe0> <<< 13271 1727203818.92285: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc16e210> <<< 13271 1727203818.92302: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 13271 1727203818.92374: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.92399: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 13271 1727203818.92402: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 13271 1727203818.92451: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc173890> <<< 13271 1727203818.92589: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc170260> <<< 13271 1727203818.92658: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc174950> <<< 13271 1727203818.92697: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.92708: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc174aa0> <<< 13271 1727203818.92739: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc174b60> <<< 13271 1727203818.92771: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc16c170> <<< 13271 1727203818.92794: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 13271 1727203818.92809: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 13271 1727203818.92834: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 13271 1727203818.92858: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' 
<<< 13271 1727203818.92900: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc000320> <<< 13271 1727203818.93072: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 13271 1727203818.93078: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc001670> <<< 13271 1727203818.93121: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc176ab0> <<< 13271 1727203818.93124: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc177e60> <<< 13271 1727203818.93127: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc1766f0> <<< 13271 1727203818.93163: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # <<< 13271 1727203818.93178: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.93271: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.93390: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.93396: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 13271 1727203818.93411: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 13271 1727203818.93437: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 13271 1727203818.93556: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.93697: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.94283: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.94855: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 13271 1727203818.94881: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 13271 1727203818.94896: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 13271 1727203818.94923: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.94983: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc0058e0> <<< 13271 1727203818.95074: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 13271 1727203818.95101: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc0066c0> <<< 13271 1727203818.95104: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc0018b0> <<< 13271 
1727203818.95166: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 13271 1727203818.95194: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.95197: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.95222: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 13271 1727203818.95388: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.95550: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 13271 1727203818.95590: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc006420> # zipimport: zlib available <<< 13271 1727203818.96193: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.96558: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.96641: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.96755: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 13271 1727203818.96766: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.96810: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 13271 1727203818.96828: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.96894: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.96995: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 13271 1727203818.96999: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.97025: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 13271 1727203818.97071: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 13271 1727203818.97114: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 13271 1727203818.97133: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.97384: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.97642: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 13271 1727203818.97723: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 13271 1727203818.97727: stdout chunk (state=3): >>>import '_ast' # <<< 13271 1727203818.97813: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc0079e0> <<< 13271 1727203818.97816: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.97900: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.97980: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 13271 1727203818.98007: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 13271 1727203818.98022: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.98069: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.98121: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 13271 1727203818.98124: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.98158: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.98215: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.98267: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.98343: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 13271 1727203818.98389: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.98494: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc012330> <<< 13271 1727203818.98553: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc00d160> <<< 13271 1727203818.98571: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 13271 1727203818.98588: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.98652: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.98712: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.98746: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.98800: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 13271 1727203818.98814: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 13271 1727203818.98849: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 13271 1727203818.98865: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 13271 1727203818.98944: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 13271 1727203818.98955: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 13271 1727203818.98982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 13271 1727203818.99040: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc102b70> <<< 13271 1727203818.99090: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc846840> <<< 13271 1727203818.99184: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc012150> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc0049b0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 13271 1727203818.99215: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.99256: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.99263: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 13271 1727203818.99276: stdout chunk (state=3): >>>import 'ansible.module_utils.common.sys_info' # <<< 13271 1727203818.99325: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 13271 1727203818.99353: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.99368: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 13271 1727203818.99518: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 1727203818.99723: stdout chunk (state=3): >>># zipimport: zlib available <<< 13271 
1727203818.99852: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 13271 1727203819.00331: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 13271 1727203819.00335: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site <<< 13271 1727203819.00356: stdout chunk (state=3): >>># destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing 
operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma <<< 13271 1727203819.00416: stdout chunk (state=3): >>># cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing 
zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl <<< 13271 1727203819.00428: stdout chunk (state=3): >>># cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # 
destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections<<< 13271 1727203819.00483: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # 
destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro<<< 13271 1727203819.00493: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 13271 1727203819.00849: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 13271 1727203819.00887: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 <<< 13271 1727203819.00913: stdout chunk (state=3): >>># destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 13271 1727203819.00964: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ <<< 13271 1727203819.00970: stdout chunk (state=3): >>># destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 13271 1727203819.01018: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # 
destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog <<< 13271 1727203819.01060: stdout chunk (state=3): >>># destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime <<< 13271 1727203819.01089: stdout chunk (state=3): >>># destroy selinux # destroy shutil <<< 13271 1727203819.01103: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 13271 1727203819.01169: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 13271 1727203819.01176: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc <<< 13271 1727203819.01236: stdout chunk (state=3): >>># cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 13271 1727203819.01239: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref <<< 13271 1727203819.01246: stdout chunk (state=3): >>># cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 13271 1727203819.01351: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy 
re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types <<< 13271 1727203819.01360: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 13271 1727203819.01363: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs <<< 13271 1727203819.01371: stdout chunk (state=3): >>># cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 13271 1727203819.01373: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 13271 1727203819.01377: stdout chunk (state=3): >>># cleanup[3] wiping builtins <<< 13271 1727203819.01379: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 <<< 13271 1727203819.01399: stdout chunk (state=3): >>># destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 13271 1727203819.01561: stdout chunk (state=3): >>># destroy sys.monitoring <<< 13271 1727203819.01595: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 13271 1727203819.01639: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # 
destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib <<< 13271 1727203819.01642: stdout chunk (state=3): >>># destroy copyreg # destroy contextlib <<< 13271 1727203819.01673: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error <<< 13271 1727203819.01718: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 13271 1727203819.01810: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs <<< 13271 1727203819.01852: stdout chunk (state=3): >>># destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 13271 1727203819.01888: stdout chunk (state=3): >>># destroy _hashlib # destroy _operator # destroy _string # destroy re <<< 13271 1727203819.01906: stdout chunk (state=3): >>># destroy itertools <<< 13271 1727203819.01940: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 13271 1727203819.02381: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203819.02384: stdout chunk (state=3): >>><<< 13271 1727203819.02386: stderr chunk (state=3): >>><<< 13271 1727203819.02447: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcc184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcbe7b30> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcc1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca2d130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca2e060> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca6bf80> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca80110> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaa3950> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaa3fe0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca83bf0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca812e0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca69130> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcac78f0> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7effdcac6510> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca82390> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcac4d40> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaf4950> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca683b0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdcaf4e00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaf4cb0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdcaf50a0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdca66ed0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaf5790> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaf5460> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaf6660> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcb10890> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdcb11fd0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7effdcb12e70> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdcb134a0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcb123c0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdcb13e60> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcb13590> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaf66c0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc8afd10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc8d8860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc8d85c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc8d8890> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc8d91c0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc8d9bb0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc8d8a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc8adeb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc8daf90> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc8d9d00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdcaf6db0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc9032f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc9276e0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc9884d0> # 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc98ac30> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc9885f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc9514c0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc78d5b0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc9264e0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc8dbec0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7effdc926840> # zipimport: found 30 names in '/tmp/ansible_stat_payload_ufr0yleu/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from 
'/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc7df290> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc7c2180> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc7c1310> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc7dcfe0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc80ab70> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc80a900> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc80a210> # 
/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc80ad20> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc7dff20> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc80b8c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc80ba40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc80bf80> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc115ca0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc1178c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches 
/usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc1182c0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc119160> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc11be60> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdcb12de0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc11a180> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc127d10> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc1267e0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc126540> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc126ab0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc11a660> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc16bf50> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc16b980> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc16db20> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc16d8e0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc16ffe0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc16e210> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc173890> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc170260> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc174950> # extension module 'systemd._reader' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc174aa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc174b60> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc16c170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc000320> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc001670> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc176ab0> # extension module 'systemd._daemon' loaded from 
'/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc177e60> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc1766f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc0058e0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc0066c0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc0018b0> import 
'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc006420> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc0079e0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7effdc012330> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc00d160> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc102b70> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc846840> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc012150> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7effdc0049b0> # destroy ansible.module_utils.distro 
import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # 
cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing 
urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy 
ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy 
subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # 
cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ [... interpreter shutdown trace, identical to the one printed above, omitted ...] # clear sys.audit hooks 13271 1727203819.03547: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203818.5360866-13540-242152280958336/', 
'_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203819.03550: _low_level_execute_command(): starting 13271 1727203819.03553: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203818.5360866-13540-242152280958336/ > /dev/null 2>&1 && sleep 0' 13271 1727203819.03555: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203819.03558: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203819.03563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203819.03565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203819.03567: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203819.03569: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203819.03571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203819.03573: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 1727203819.03575: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 13271 1727203819.03578: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13271 1727203819.03580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203819.03582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203819.03583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203819.03585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203819.03587: stderr chunk (state=3): 
>>>debug2: match found <<< 13271 1727203819.03603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203819.03606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203819.03608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203819.03609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203819.03611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203819.05640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203819.05644: stderr chunk (state=3): >>><<< 13271 1727203819.05646: stdout chunk (state=3): >>><<< 13271 1727203819.05674: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 13271 1727203819.05882: handler run complete 13271 1727203819.05886: attempt loop complete, returning result 13271 1727203819.05889: _execute() done 13271 1727203819.05891: dumping result to json 13271 1727203819.05893: done dumping result, returning 13271 1727203819.05895: done running TaskExecutor() for managed-node1/TASK: Check if system is ostree [028d2410-947f-2a40-12ba-0000000000df] 13271 1727203819.05897: sending task result for task 028d2410-947f-2a40-12ba-0000000000df 13271 1727203819.05966: done sending task result for task 028d2410-947f-2a40-12ba-0000000000df 13271 1727203819.05969: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 13271 1727203819.06141: no more pending results, returning what we have 13271 1727203819.06144: results queue empty 13271 1727203819.06145: checking for any_errors_fatal 13271 1727203819.06151: done checking for any_errors_fatal 13271 1727203819.06152: checking for max_fail_percentage 13271 1727203819.06153: done checking for max_fail_percentage 13271 1727203819.06154: checking to see if all hosts have failed and the running result is not ok 13271 1727203819.06155: done checking to see if all hosts have failed 13271 1727203819.06156: getting the remaining hosts for this loop 13271 1727203819.06157: done getting the remaining hosts for this loop 13271 1727203819.06159: getting the next task for host managed-node1 13271 1727203819.06164: done getting next task for host managed-node1 13271 1727203819.06167: ^ task is: TASK: Set flag to indicate system is ostree 13271 1727203819.06169: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203819.06172: getting variables 13271 1727203819.06173: in VariableManager get_vars() 13271 1727203819.06205: Calling all_inventory to load vars for managed-node1 13271 1727203819.06207: Calling groups_inventory to load vars for managed-node1 13271 1727203819.06210: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203819.06220: Calling all_plugins_play to load vars for managed-node1 13271 1727203819.06222: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203819.06224: Calling groups_plugins_play to load vars for managed-node1 13271 1727203819.06420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203819.06614: done with get_vars() 13271 1727203819.06631: done getting variables 13271 1727203819.06734: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:50:19 -0400 (0:00:00.588) 0:00:02.710 ***** 13271 1727203819.06767: entering _queue_task() for managed-node1/set_fact 13271 1727203819.06769: Creating lock for set_fact 13271 1727203819.07188: worker is 1 (out of 1 available) 13271 
1727203819.07197: exiting _queue_task() for managed-node1/set_fact 13271 1727203819.07208: done queuing things up, now waiting for results queue to drain 13271 1727203819.07210: waiting for pending results... 13271 1727203819.07345: running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree 13271 1727203819.07499: in run() - task 028d2410-947f-2a40-12ba-0000000000e0 13271 1727203819.07503: variable 'ansible_search_path' from source: unknown 13271 1727203819.07505: variable 'ansible_search_path' from source: unknown 13271 1727203819.07508: calling self._execute() 13271 1727203819.07569: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203819.07577: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203819.07589: variable 'omit' from source: magic vars 13271 1727203819.08133: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13271 1727203819.08366: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13271 1727203819.08406: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13271 1727203819.08437: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13271 1727203819.08474: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13271 1727203819.08583: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13271 1727203819.08586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13271 1727203819.08598: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203819.08622: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13271 1727203819.08739: Evaluated conditional (not __network_is_ostree is defined): True 13271 1727203819.08751: variable 'omit' from source: magic vars 13271 1727203819.08851: variable 'omit' from source: magic vars 13271 1727203819.08922: variable '__ostree_booted_stat' from source: set_fact 13271 1727203819.08982: variable 'omit' from source: magic vars 13271 1727203819.09014: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203819.09046: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203819.09078: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203819.09101: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203819.09118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203819.09151: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203819.09178: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203819.09182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203819.09269: Set connection var ansible_connection to ssh 13271 1727203819.09391: Set connection var ansible_shell_type to sh 13271 1727203819.09395: Set connection var ansible_timeout to 10 13271 1727203819.09397: Set connection var ansible_module_compression to 
ZIP_DEFLATED 13271 1727203819.09399: Set connection var ansible_pipelining to False 13271 1727203819.09401: Set connection var ansible_shell_executable to /bin/sh 13271 1727203819.09403: variable 'ansible_shell_executable' from source: unknown 13271 1727203819.09406: variable 'ansible_connection' from source: unknown 13271 1727203819.09408: variable 'ansible_module_compression' from source: unknown 13271 1727203819.09409: variable 'ansible_shell_type' from source: unknown 13271 1727203819.09411: variable 'ansible_shell_executable' from source: unknown 13271 1727203819.09413: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203819.09415: variable 'ansible_pipelining' from source: unknown 13271 1727203819.09417: variable 'ansible_timeout' from source: unknown 13271 1727203819.09419: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203819.09492: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203819.09533: variable 'omit' from source: magic vars 13271 1727203819.09536: starting attempt loop 13271 1727203819.09539: running the handler 13271 1727203819.09542: handler run complete 13271 1727203819.09553: attempt loop complete, returning result 13271 1727203819.09560: _execute() done 13271 1727203819.09610: dumping result to json 13271 1727203819.09613: done dumping result, returning 13271 1727203819.09615: done running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree [028d2410-947f-2a40-12ba-0000000000e0] 13271 1727203819.09617: sending task result for task 028d2410-947f-2a40-12ba-0000000000e0 ok: [managed-node1] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 13271 1727203819.09741: no 
more pending results, returning what we have 13271 1727203819.09744: results queue empty 13271 1727203819.09745: checking for any_errors_fatal 13271 1727203819.09753: done checking for any_errors_fatal 13271 1727203819.09754: checking for max_fail_percentage 13271 1727203819.09756: done checking for max_fail_percentage 13271 1727203819.09757: checking to see if all hosts have failed and the running result is not ok 13271 1727203819.09758: done checking to see if all hosts have failed 13271 1727203819.09759: getting the remaining hosts for this loop 13271 1727203819.09760: done getting the remaining hosts for this loop 13271 1727203819.09763: getting the next task for host managed-node1 13271 1727203819.09773: done getting next task for host managed-node1 13271 1727203819.09878: ^ task is: TASK: Fix CentOS6 Base repo 13271 1727203819.09882: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203819.09886: getting variables 13271 1727203819.09888: in VariableManager get_vars() 13271 1727203819.09921: Calling all_inventory to load vars for managed-node1 13271 1727203819.09923: Calling groups_inventory to load vars for managed-node1 13271 1727203819.09927: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203819.09939: Calling all_plugins_play to load vars for managed-node1 13271 1727203819.09942: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203819.09945: Calling groups_plugins_play to load vars for managed-node1 13271 1727203819.10243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203819.10516: done with get_vars() 13271 1727203819.10526: done getting variables 13271 1727203819.10556: done sending task result for task 028d2410-947f-2a40-12ba-0000000000e0 13271 1727203819.10558: WORKER PROCESS EXITING 13271 1727203819.10652: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:50:19 -0400 (0:00:00.039) 0:00:02.749 ***** 13271 1727203819.10684: entering _queue_task() for managed-node1/copy 13271 1727203819.10930: worker is 1 (out of 1 available) 13271 1727203819.10941: exiting _queue_task() for managed-node1/copy 13271 1727203819.10951: done queuing things up, now waiting for results queue to drain 13271 1727203819.10953: waiting for pending results... 
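The task queued here, "Fix CentOS6 Base repo" (el_repo_setup.yml:26), is a `copy` action guarded by the two distribution conditionals the next records evaluate: `ansible_distribution == 'CentOS'` is True, but `ansible_distribution_major_version == '6'` is False, so the task is skipped. A minimal sketch of that guarded-copy pattern follows; the `when:` clauses are taken verbatim from the log, while the destination path and repo contents are illustrative assumptions, not the real task body:

```yaml
# Sketch of a conditionally guarded copy task like "Fix CentOS6 Base repo".
# Only the when: conditions are verbatim from the log; dest and content
# are hypothetical placeholders.
- name: Fix CentOS6 Base repo
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo  # assumed destination
    content: |
      [base]
      name=CentOS-$releasever - Base
      gpgcheck=1
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'
```

Because the managed node is not CentOS 6, the log reports `"skip_reason": "Conditional result was False"` with `false_condition` naming the failed clause.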
13271 1727203819.11196: running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo 13271 1727203819.11302: in run() - task 028d2410-947f-2a40-12ba-0000000000e2 13271 1727203819.11322: variable 'ansible_search_path' from source: unknown 13271 1727203819.11331: variable 'ansible_search_path' from source: unknown 13271 1727203819.11383: calling self._execute() 13271 1727203819.11470: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203819.11484: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203819.11498: variable 'omit' from source: magic vars 13271 1727203819.11981: variable 'ansible_distribution' from source: facts 13271 1727203819.12080: Evaluated conditional (ansible_distribution == 'CentOS'): True 13271 1727203819.12140: variable 'ansible_distribution_major_version' from source: facts 13271 1727203819.12151: Evaluated conditional (ansible_distribution_major_version == '6'): False 13271 1727203819.12159: when evaluation is False, skipping this task 13271 1727203819.12167: _execute() done 13271 1727203819.12175: dumping result to json 13271 1727203819.12185: done dumping result, returning 13271 1727203819.12195: done running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo [028d2410-947f-2a40-12ba-0000000000e2] 13271 1727203819.12211: sending task result for task 028d2410-947f-2a40-12ba-0000000000e2 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 13271 1727203819.12383: no more pending results, returning what we have 13271 1727203819.12387: results queue empty 13271 1727203819.12388: checking for any_errors_fatal 13271 1727203819.12394: done checking for any_errors_fatal 13271 1727203819.12394: checking for max_fail_percentage 13271 1727203819.12396: done checking for max_fail_percentage 13271 1727203819.12397: checking to see if all hosts have failed and the 
running result is not ok 13271 1727203819.12398: done checking to see if all hosts have failed 13271 1727203819.12399: getting the remaining hosts for this loop 13271 1727203819.12401: done getting the remaining hosts for this loop 13271 1727203819.12404: getting the next task for host managed-node1 13271 1727203819.12411: done getting next task for host managed-node1 13271 1727203819.12414: ^ task is: TASK: Include the task 'enable_epel.yml' 13271 1727203819.12418: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203819.12482: getting variables 13271 1727203819.12485: in VariableManager get_vars() 13271 1727203819.12513: Calling all_inventory to load vars for managed-node1 13271 1727203819.12516: Calling groups_inventory to load vars for managed-node1 13271 1727203819.12520: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203819.12649: Calling all_plugins_play to load vars for managed-node1 13271 1727203819.12654: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203819.12659: Calling groups_plugins_play to load vars for managed-node1 13271 1727203819.12910: done sending task result for task 028d2410-947f-2a40-12ba-0000000000e2 13271 1727203819.12914: WORKER PROCESS EXITING 13271 1727203819.12937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203819.13153: done with get_vars() 13271 1727203819.13162: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:50:19 -0400 (0:00:00.025) 0:00:02.775 ***** 13271 1727203819.13259: entering _queue_task() for managed-node1/include_tasks 13271 1727203819.13505: worker is 1 (out of 1 available) 13271 1727203819.13519: exiting _queue_task() for managed-node1/include_tasks 13271 1727203819.13529: done queuing things up, now waiting for results queue to drain 13271 1727203819.13530: waiting for pending results... 
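The include queued next is gated on the `__network_is_ostree` fact set a few tasks earlier. A minimal sketch of the pattern, assuming the include target path (the condition string is verbatim from the log record "Evaluated conditional (not __network_is_ostree | d(false)): True"):

```yaml
# Conditional include, as evaluated in the log; the file path relative
# to the tests directory is an assumption.
- name: Include the task 'enable_epel.yml'
  ansible.builtin.include_tasks: tasks/enable_epel.yml
  when: not __network_is_ostree | d(false)
```

Since the earlier `stat` of `/run/ostree-booted` returned `exists: false`, `__network_is_ostree` was set to `false`, the condition holds, and the log proceeds to load enable_epel.yml and extend the task lists with its blocks.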
13271 1727203819.13740: running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' 13271 1727203819.13832: in run() - task 028d2410-947f-2a40-12ba-0000000000e3 13271 1727203819.13847: variable 'ansible_search_path' from source: unknown 13271 1727203819.13861: variable 'ansible_search_path' from source: unknown 13271 1727203819.13898: calling self._execute() 13271 1727203819.13981: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203819.13984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203819.14078: variable 'omit' from source: magic vars 13271 1727203819.14456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13271 1727203819.16616: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13271 1727203819.16701: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13271 1727203819.16744: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13271 1727203819.16787: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13271 1727203819.16826: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13271 1727203819.16917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203819.16953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203819.17013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203819.17042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203819.17065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203819.17196: variable '__network_is_ostree' from source: set_fact 13271 1727203819.17231: Evaluated conditional (not __network_is_ostree | d(false)): True 13271 1727203819.17381: _execute() done 13271 1727203819.17384: dumping result to json 13271 1727203819.17386: done dumping result, returning 13271 1727203819.17389: done running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' [028d2410-947f-2a40-12ba-0000000000e3] 13271 1727203819.17391: sending task result for task 028d2410-947f-2a40-12ba-0000000000e3 13271 1727203819.17463: done sending task result for task 028d2410-947f-2a40-12ba-0000000000e3 13271 1727203819.17466: WORKER PROCESS EXITING 13271 1727203819.17497: no more pending results, returning what we have 13271 1727203819.17503: in VariableManager get_vars() 13271 1727203819.17539: Calling all_inventory to load vars for managed-node1 13271 1727203819.17542: Calling groups_inventory to load vars for managed-node1 13271 1727203819.17546: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203819.17557: Calling all_plugins_play to load vars for managed-node1 13271 1727203819.17561: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203819.17564: Calling groups_plugins_play to load vars for managed-node1 13271 1727203819.17858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 13271 1727203819.18158: done with get_vars() 13271 1727203819.18167: variable 'ansible_search_path' from source: unknown 13271 1727203819.18169: variable 'ansible_search_path' from source: unknown 13271 1727203819.18209: we have included files to process 13271 1727203819.18210: generating all_blocks data 13271 1727203819.18212: done generating all_blocks data 13271 1727203819.18217: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13271 1727203819.18219: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13271 1727203819.18226: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 13271 1727203819.18933: done processing included file 13271 1727203819.18935: iterating over new_blocks loaded from include file 13271 1727203819.18937: in VariableManager get_vars() 13271 1727203819.18948: done with get_vars() 13271 1727203819.18949: filtering new block on tags 13271 1727203819.18969: done filtering new block on tags 13271 1727203819.18971: in VariableManager get_vars() 13271 1727203819.18985: done with get_vars() 13271 1727203819.18987: filtering new block on tags 13271 1727203819.18997: done filtering new block on tags 13271 1727203819.18998: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node1 13271 1727203819.19004: extending task lists for all hosts with included blocks 13271 1727203819.19097: done extending task lists 13271 1727203819.19098: done processing included files 13271 1727203819.19099: results queue empty 13271 1727203819.19100: checking for any_errors_fatal 13271 1727203819.19103: done checking for any_errors_fatal 13271 1727203819.19104: checking for max_fail_percentage 13271 1727203819.19105: done 
checking for max_fail_percentage 13271 1727203819.19106: checking to see if all hosts have failed and the running result is not ok 13271 1727203819.19106: done checking to see if all hosts have failed 13271 1727203819.19107: getting the remaining hosts for this loop 13271 1727203819.19108: done getting the remaining hosts for this loop 13271 1727203819.19110: getting the next task for host managed-node1 13271 1727203819.19114: done getting next task for host managed-node1 13271 1727203819.19116: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 13271 1727203819.19118: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203819.19120: getting variables 13271 1727203819.19121: in VariableManager get_vars() 13271 1727203819.19129: Calling all_inventory to load vars for managed-node1 13271 1727203819.19131: Calling groups_inventory to load vars for managed-node1 13271 1727203819.19133: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203819.19138: Calling all_plugins_play to load vars for managed-node1 13271 1727203819.19145: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203819.19148: Calling groups_plugins_play to load vars for managed-node1 13271 1727203819.19305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203819.19486: done with get_vars() 13271 1727203819.19493: done getting variables 13271 1727203819.19559: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 13271 1727203819.19768: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 10] ********************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:50:19 -0400 (0:00:00.065) 0:00:02.841 ***** 13271 1727203819.19815: entering _queue_task() for managed-node1/command 13271 1727203819.19817: Creating lock for command 13271 1727203819.20147: worker is 1 (out of 1 available) 13271 1727203819.20159: exiting _queue_task() for managed-node1/command 13271 1727203819.20174: done queuing things up, now waiting for results queue to drain 13271 1727203819.20178: waiting for pending results... 
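The header `TASK [Create EPEL 10]` shows the templated task name `Create EPEL {{ ansible_distribution_major_version }}` already rendered with the fact value 10. The log identifies it as a `command` action guarded so it only runs on EL 7/8; a sketch under those assumptions (the conditionals are verbatim from the log, the command body is hypothetical):

```yaml
# Sketch of the templated, guarded command task seen in the log.
# EPEL is conventionally enabled by installing epel-release; the exact
# command run by the real task is not shown in this excerpt.
- name: Create EPEL {{ ansible_distribution_major_version }}
  ansible.builtin.command: dnf install -y epel-release  # assumed body
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```

On this node `ansible_distribution_major_version` is `10`, so the second clause fails and the task is skipped, exactly as the result JSON records.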
13271 1727203819.20404: running TaskExecutor() for managed-node1/TASK: Create EPEL 10
13271 1727203819.20490: in run() - task 028d2410-947f-2a40-12ba-0000000000fd
13271 1727203819.20515: variable 'ansible_search_path' from source: unknown
13271 1727203819.20580: variable 'ansible_search_path' from source: unknown
13271 1727203819.20584: calling self._execute()
13271 1727203819.20647: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203819.20660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203819.20674: variable 'omit' from source: magic vars
13271 1727203819.21066: variable 'ansible_distribution' from source: facts
13271 1727203819.21082: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
13271 1727203819.21224: variable 'ansible_distribution_major_version' from source: facts
13271 1727203819.21238: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
13271 1727203819.21260: when evaluation is False, skipping this task
13271 1727203819.21263: _execute() done
13271 1727203819.21266: dumping result to json
13271 1727203819.21268: done dumping result, returning
13271 1727203819.21340: done running TaskExecutor() for managed-node1/TASK: Create EPEL 10 [028d2410-947f-2a40-12ba-0000000000fd]
13271 1727203819.21343: sending task result for task 028d2410-947f-2a40-12ba-0000000000fd
13271 1727203819.21419: done sending task result for task 028d2410-947f-2a40-12ba-0000000000fd
13271 1727203819.21421: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
13271 1727203819.21502: no more pending results, returning what we have
13271 1727203819.21506: results queue empty
13271 1727203819.21507: checking for any_errors_fatal
13271 1727203819.21508: done checking for any_errors_fatal
13271 1727203819.21509: checking for max_fail_percentage
13271 1727203819.21510: done checking for max_fail_percentage
13271 1727203819.21511: checking to see if all hosts have failed and the running result is not ok
13271 1727203819.21512: done checking to see if all hosts have failed
13271 1727203819.21513: getting the remaining hosts for this loop
13271 1727203819.21514: done getting the remaining hosts for this loop
13271 1727203819.21518: getting the next task for host managed-node1
13271 1727203819.21525: done getting next task for host managed-node1
13271 1727203819.21528: ^ task is: TASK: Install yum-utils package
13271 1727203819.21532: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203819.21536: getting variables
13271 1727203819.21538: in VariableManager get_vars()
13271 1727203819.21569: Calling all_inventory to load vars for managed-node1
13271 1727203819.21571: Calling groups_inventory to load vars for managed-node1
13271 1727203819.21577: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203819.21591: Calling all_plugins_play to load vars for managed-node1
13271 1727203819.21594: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203819.21597: Calling groups_plugins_play to load vars for managed-node1
13271 1727203819.21982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203819.22165: done with get_vars()
13271 1727203819.22178: done getting variables
13271 1727203819.22290: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Install yum-utils package] ***********************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26
Tuesday 24 September 2024 14:50:19 -0400 (0:00:00.025) 0:00:02.866 *****
13271 1727203819.22326: entering _queue_task() for managed-node1/package
13271 1727203819.22327: Creating lock for package
13271 1727203819.22603: worker is 1 (out of 1 available)
13271 1727203819.22616: exiting _queue_task() for managed-node1/package
13271 1727203819.22628: done queuing things up, now waiting for results queue to drain
13271 1727203819.22630: waiting for pending results...
13271 1727203819.22901: running TaskExecutor() for managed-node1/TASK: Install yum-utils package
13271 1727203819.22985: in run() - task 028d2410-947f-2a40-12ba-0000000000fe
13271 1727203819.22988: variable 'ansible_search_path' from source: unknown
13271 1727203819.22991: variable 'ansible_search_path' from source: unknown
13271 1727203819.22993: calling self._execute()
13271 1727203819.23149: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203819.23163: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203819.23181: variable 'omit' from source: magic vars
13271 1727203819.23549: variable 'ansible_distribution' from source: facts
13271 1727203819.23780: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
13271 1727203819.23784: variable 'ansible_distribution_major_version' from source: facts
13271 1727203819.23786: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
13271 1727203819.23789: when evaluation is False, skipping this task
13271 1727203819.23792: _execute() done
13271 1727203819.23794: dumping result to json
13271 1727203819.23797: done dumping result, returning
13271 1727203819.23800: done running TaskExecutor() for managed-node1/TASK: Install yum-utils package [028d2410-947f-2a40-12ba-0000000000fe]
13271 1727203819.23802: sending task result for task 028d2410-947f-2a40-12ba-0000000000fe
13271 1727203819.23871: done sending task result for task 028d2410-947f-2a40-12ba-0000000000fe
13271 1727203819.23876: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
13271 1727203819.23974: no more pending results, returning what we have
13271 1727203819.23979: results queue empty
13271 1727203819.23980: checking for any_errors_fatal
13271 1727203819.23984: done checking for any_errors_fatal
13271 1727203819.23984: checking for max_fail_percentage
13271 1727203819.23986: done checking for max_fail_percentage
13271 1727203819.23987: checking to see if all hosts have failed and the running result is not ok
13271 1727203819.23988: done checking to see if all hosts have failed
13271 1727203819.23988: getting the remaining hosts for this loop
13271 1727203819.23990: done getting the remaining hosts for this loop
13271 1727203819.23993: getting the next task for host managed-node1
13271 1727203819.23999: done getting next task for host managed-node1
13271 1727203819.24001: ^ task is: TASK: Enable EPEL 7
13271 1727203819.24004: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203819.24006: getting variables
13271 1727203819.24007: in VariableManager get_vars()
13271 1727203819.24027: Calling all_inventory to load vars for managed-node1
13271 1727203819.24029: Calling groups_inventory to load vars for managed-node1
13271 1727203819.24032: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203819.24041: Calling all_plugins_play to load vars for managed-node1
13271 1727203819.24043: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203819.24045: Calling groups_plugins_play to load vars for managed-node1
13271 1727203819.24236: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203819.24434: done with get_vars()
13271 1727203819.24443: done getting variables
13271 1727203819.24508: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 7] ***********************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32
Tuesday 24 September 2024 14:50:19 -0400 (0:00:00.022) 0:00:02.888 *****
13271 1727203819.24543: entering _queue_task() for managed-node1/command
13271 1727203819.24814: worker is 1 (out of 1 available)
13271 1727203819.24826: exiting _queue_task() for managed-node1/command
13271 1727203819.24835: done queuing things up, now waiting for results queue to drain
13271 1727203819.24836: waiting for pending results...
13271 1727203819.25040: running TaskExecutor() for managed-node1/TASK: Enable EPEL 7
13271 1727203819.25193: in run() - task 028d2410-947f-2a40-12ba-0000000000ff
13271 1727203819.25196: variable 'ansible_search_path' from source: unknown
13271 1727203819.25199: variable 'ansible_search_path' from source: unknown
13271 1727203819.25209: calling self._execute()
13271 1727203819.25283: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203819.25297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203819.25312: variable 'omit' from source: magic vars
13271 1727203819.25684: variable 'ansible_distribution' from source: facts
13271 1727203819.25736: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
13271 1727203819.25835: variable 'ansible_distribution_major_version' from source: facts
13271 1727203819.25855: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
13271 1727203819.25867: when evaluation is False, skipping this task
13271 1727203819.25877: _execute() done
13271 1727203819.25954: dumping result to json
13271 1727203819.25958: done dumping result, returning
13271 1727203819.25964: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 [028d2410-947f-2a40-12ba-0000000000ff]
13271 1727203819.25967: sending task result for task 028d2410-947f-2a40-12ba-0000000000ff
13271 1727203819.26032: done sending task result for task 028d2410-947f-2a40-12ba-0000000000ff
13271 1727203819.26036: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
13271 1727203819.26109: no more pending results, returning what we have
13271 1727203819.26113: results queue empty
13271 1727203819.26114: checking for any_errors_fatal
13271 1727203819.26120: done checking for any_errors_fatal
13271 1727203819.26121: checking for max_fail_percentage
13271 1727203819.26123: done checking for max_fail_percentage
13271 1727203819.26124: checking to see if all hosts have failed and the running result is not ok
13271 1727203819.26125: done checking to see if all hosts have failed
13271 1727203819.26126: getting the remaining hosts for this loop
13271 1727203819.26127: done getting the remaining hosts for this loop
13271 1727203819.26130: getting the next task for host managed-node1
13271 1727203819.26137: done getting next task for host managed-node1
13271 1727203819.26140: ^ task is: TASK: Enable EPEL 8
13271 1727203819.26143: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203819.26148: getting variables
13271 1727203819.26150: in VariableManager get_vars()
13271 1727203819.26294: Calling all_inventory to load vars for managed-node1
13271 1727203819.26297: Calling groups_inventory to load vars for managed-node1
13271 1727203819.26301: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203819.26312: Calling all_plugins_play to load vars for managed-node1
13271 1727203819.26315: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203819.26319: Calling groups_plugins_play to load vars for managed-node1
13271 1727203819.26578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203819.26778: done with get_vars()
13271 1727203819.26788: done getting variables
13271 1727203819.26844: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 8] ***********************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37
Tuesday 24 September 2024 14:50:19 -0400 (0:00:00.023) 0:00:02.911 *****
13271 1727203819.26878: entering _queue_task() for managed-node1/command
13271 1727203819.27105: worker is 1 (out of 1 available)
13271 1727203819.27115: exiting _queue_task() for managed-node1/command
13271 1727203819.27125: done queuing things up, now waiting for results queue to drain
13271 1727203819.27127: waiting for pending results...
13271 1727203819.27367: running TaskExecutor() for managed-node1/TASK: Enable EPEL 8
13271 1727203819.27569: in run() - task 028d2410-947f-2a40-12ba-000000000100
13271 1727203819.27573: variable 'ansible_search_path' from source: unknown
13271 1727203819.27579: variable 'ansible_search_path' from source: unknown
13271 1727203819.27582: calling self._execute()
13271 1727203819.27627: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203819.27637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203819.27649: variable 'omit' from source: magic vars
13271 1727203819.28036: variable 'ansible_distribution' from source: facts
13271 1727203819.28053: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
13271 1727203819.28242: variable 'ansible_distribution_major_version' from source: facts
13271 1727203819.28245: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
13271 1727203819.28248: when evaluation is False, skipping this task
13271 1727203819.28250: _execute() done
13271 1727203819.28253: dumping result to json
13271 1727203819.28255: done dumping result, returning
13271 1727203819.28257: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 [028d2410-947f-2a40-12ba-000000000100]
13271 1727203819.28262: sending task result for task 028d2410-947f-2a40-12ba-000000000100
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
13271 1727203819.28427: no more pending results, returning what we have
13271 1727203819.28431: results queue empty
13271 1727203819.28432: checking for any_errors_fatal
13271 1727203819.28438: done checking for any_errors_fatal
13271 1727203819.28439: checking for max_fail_percentage
13271 1727203819.28441: done checking for max_fail_percentage
13271 1727203819.28442: checking to see if all hosts have failed and the running result is not ok
13271 1727203819.28443: done checking to see if all hosts have failed
13271 1727203819.28444: getting the remaining hosts for this loop
13271 1727203819.28445: done getting the remaining hosts for this loop
13271 1727203819.28448: getting the next task for host managed-node1
13271 1727203819.28458: done getting next task for host managed-node1
13271 1727203819.28463: ^ task is: TASK: Enable EPEL 6
13271 1727203819.28467: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203819.28472: getting variables
13271 1727203819.28474: in VariableManager get_vars()
13271 1727203819.28507: Calling all_inventory to load vars for managed-node1
13271 1727203819.28509: Calling groups_inventory to load vars for managed-node1
13271 1727203819.28513: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203819.28527: Calling all_plugins_play to load vars for managed-node1
13271 1727203819.28531: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203819.28534: Calling groups_plugins_play to load vars for managed-node1
13271 1727203819.28909: done sending task result for task 028d2410-947f-2a40-12ba-000000000100
13271 1727203819.28912: WORKER PROCESS EXITING
13271 1727203819.28934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203819.29133: done with get_vars()
13271 1727203819.29142: done getting variables
13271 1727203819.29205: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 6] ***********************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42
Tuesday 24 September 2024 14:50:19 -0400 (0:00:00.023) 0:00:02.935 *****
13271 1727203819.29235: entering _queue_task() for managed-node1/copy
13271 1727203819.29483: worker is 1 (out of 1 available)
13271 1727203819.29494: exiting _queue_task() for managed-node1/copy
13271 1727203819.29506: done queuing things up, now waiting for results queue to drain
13271 1727203819.29508: waiting for pending results...
13271 1727203819.29737: running TaskExecutor() for managed-node1/TASK: Enable EPEL 6
13271 1727203819.29855: in run() - task 028d2410-947f-2a40-12ba-000000000102
13271 1727203819.29875: variable 'ansible_search_path' from source: unknown
13271 1727203819.29886: variable 'ansible_search_path' from source: unknown
13271 1727203819.29926: calling self._execute()
13271 1727203819.30009: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203819.30023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203819.30038: variable 'omit' from source: magic vars
13271 1727203819.30394: variable 'ansible_distribution' from source: facts
13271 1727203819.30482: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
13271 1727203819.30521: variable 'ansible_distribution_major_version' from source: facts
13271 1727203819.30530: Evaluated conditional (ansible_distribution_major_version == '6'): False
13271 1727203819.30536: when evaluation is False, skipping this task
13271 1727203819.30542: _execute() done
13271 1727203819.30548: dumping result to json
13271 1727203819.30553: done dumping result, returning
13271 1727203819.30561: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 [028d2410-947f-2a40-12ba-000000000102]
13271 1727203819.30570: sending task result for task 028d2410-947f-2a40-12ba-000000000102
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
13271 1727203819.30711: no more pending results, returning what we have
13271 1727203819.30714: results queue empty
13271 1727203819.30715: checking for any_errors_fatal
13271 1727203819.30720: done checking for any_errors_fatal
13271 1727203819.30721: checking for max_fail_percentage
13271 1727203819.30723: done checking for max_fail_percentage
13271 1727203819.30724: checking to see if all hosts have failed and the running result is not ok
13271 1727203819.30725: done checking to see if all hosts have failed
13271 1727203819.30726: getting the remaining hosts for this loop
13271 1727203819.30727: done getting the remaining hosts for this loop
13271 1727203819.30730: getting the next task for host managed-node1
13271 1727203819.30738: done getting next task for host managed-node1
13271 1727203819.30740: ^ task is: TASK: Set network provider to 'nm'
13271 1727203819.30743: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203819.30747: getting variables
13271 1727203819.30748: in VariableManager get_vars()
13271 1727203819.30780: Calling all_inventory to load vars for managed-node1
13271 1727203819.30782: Calling groups_inventory to load vars for managed-node1
13271 1727203819.30785: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203819.30798: Calling all_plugins_play to load vars for managed-node1
13271 1727203819.30801: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203819.30805: Calling groups_plugins_play to load vars for managed-node1
13271 1727203819.31202: done sending task result for task 028d2410-947f-2a40-12ba-000000000102
13271 1727203819.31205: WORKER PROCESS EXITING
13271 1727203819.31227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203819.31418: done with get_vars()
13271 1727203819.31426: done getting variables
13271 1727203819.31494: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set network provider to 'nm'] ********************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:13
Tuesday 24 September 2024 14:50:19 -0400 (0:00:00.022) 0:00:02.958 *****
13271 1727203819.31519: entering _queue_task() for managed-node1/set_fact
13271 1727203819.31773: worker is 1 (out of 1 available)
13271 1727203819.31793: exiting _queue_task() for managed-node1/set_fact
13271 1727203819.31803: done queuing things up, now waiting for results queue to drain
13271 1727203819.31804: waiting for pending results...
13271 1727203819.32029: running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm'
13271 1727203819.32130: in run() - task 028d2410-947f-2a40-12ba-000000000007
13271 1727203819.32150: variable 'ansible_search_path' from source: unknown
13271 1727203819.32197: calling self._execute()
13271 1727203819.32551: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203819.32555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203819.32558: variable 'omit' from source: magic vars
13271 1727203819.32560: variable 'omit' from source: magic vars
13271 1727203819.32581: variable 'omit' from source: magic vars
13271 1727203819.32621: variable 'omit' from source: magic vars
13271 1727203819.32672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13271 1727203819.32716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13271 1727203819.32744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13271 1727203819.32772: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13271 1727203819.32791: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13271 1727203819.32824: variable 'inventory_hostname' from source: host vars for 'managed-node1'
13271 1727203819.32832: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203819.32839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203819.32940: Set connection var ansible_connection to ssh
13271 1727203819.32951: Set connection var ansible_shell_type to sh
13271 1727203819.32963: Set connection var ansible_timeout to 10
13271 1727203819.32973: Set connection var ansible_module_compression to ZIP_DEFLATED
13271 1727203819.32990: Set connection var ansible_pipelining to False
13271 1727203819.33000: Set connection var ansible_shell_executable to /bin/sh
13271 1727203819.33028: variable 'ansible_shell_executable' from source: unknown
13271 1727203819.33036: variable 'ansible_connection' from source: unknown
13271 1727203819.33043: variable 'ansible_module_compression' from source: unknown
13271 1727203819.33050: variable 'ansible_shell_type' from source: unknown
13271 1727203819.33058: variable 'ansible_shell_executable' from source: unknown
13271 1727203819.33066: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203819.33074: variable 'ansible_pipelining' from source: unknown
13271 1727203819.33088: variable 'ansible_timeout' from source: unknown
13271 1727203819.33100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203819.33242: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
13271 1727203819.33255: variable 'omit' from source: magic vars
13271 1727203819.33264: starting attempt loop
13271 1727203819.33270: running the handler
13271 1727203819.33286: handler run complete
13271 1727203819.33297: attempt loop complete, returning result
13271 1727203819.33381: _execute() done
13271 1727203819.33385: dumping result to json
13271 1727203819.33388: done dumping result, returning
13271 1727203819.33390: done running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' [028d2410-947f-2a40-12ba-000000000007]
13271 1727203819.33393: sending task result for task 028d2410-947f-2a40-12ba-000000000007
ok: [managed-node1] => {
    "ansible_facts": {
        "network_provider": "nm"
    },
    "changed": false
}
13271 1727203819.33584: no more pending results, returning what we have
13271 1727203819.33588: results queue empty
13271 1727203819.33588: checking for any_errors_fatal
13271 1727203819.33596: done checking for any_errors_fatal
13271 1727203819.33596: checking for max_fail_percentage
13271 1727203819.33598: done checking for max_fail_percentage
13271 1727203819.33599: checking to see if all hosts have failed and the running result is not ok
13271 1727203819.33601: done checking to see if all hosts have failed
13271 1727203819.33601: getting the remaining hosts for this loop
13271 1727203819.33603: done getting the remaining hosts for this loop
13271 1727203819.33607: getting the next task for host managed-node1
13271 1727203819.33616: done getting next task for host managed-node1
13271 1727203819.33618: ^ task is: TASK: meta (flush_handlers)
13271 1727203819.33620: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203819.33624: getting variables
13271 1727203819.33626: in VariableManager get_vars()
13271 1727203819.33660: Calling all_inventory to load vars for managed-node1
13271 1727203819.33663: Calling groups_inventory to load vars for managed-node1
13271 1727203819.33667: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203819.33792: Calling all_plugins_play to load vars for managed-node1
13271 1727203819.33796: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203819.33801: Calling groups_plugins_play to load vars for managed-node1
13271 1727203819.34018: done sending task result for task 028d2410-947f-2a40-12ba-000000000007
13271 1727203819.34021: WORKER PROCESS EXITING
13271 1727203819.34044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203819.34235: done with get_vars()
13271 1727203819.34245: done getting variables
13271 1727203819.34313: in VariableManager get_vars()
13271 1727203819.34322: Calling all_inventory to load vars for managed-node1
13271 1727203819.34325: Calling groups_inventory to load vars for managed-node1
13271 1727203819.34327: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203819.34337: Calling all_plugins_play to load vars for managed-node1
13271 1727203819.34339: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203819.34342: Calling groups_plugins_play to load vars for managed-node1
13271 1727203819.34477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203819.34691: done with get_vars()
13271 1727203819.34704: done queuing things up, now waiting for results queue to drain
13271 1727203819.34706: results queue empty
13271 1727203819.34707: checking for any_errors_fatal
13271 1727203819.34709: done checking for any_errors_fatal
13271 1727203819.34710: checking for max_fail_percentage
13271 1727203819.34711: done checking for max_fail_percentage
13271 1727203819.34712: checking to see if all hosts have failed and the running result is not ok
13271 1727203819.34712: done checking to see if all hosts have failed
13271 1727203819.34713: getting the remaining hosts for this loop
13271 1727203819.34714: done getting the remaining hosts for this loop
13271 1727203819.34716: getting the next task for host managed-node1
13271 1727203819.34720: done getting next task for host managed-node1
13271 1727203819.34721: ^ task is: TASK: meta (flush_handlers)
13271 1727203819.34723: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203819.34731: getting variables
13271 1727203819.34732: in VariableManager get_vars()
13271 1727203819.34740: Calling all_inventory to load vars for managed-node1
13271 1727203819.34742: Calling groups_inventory to load vars for managed-node1
13271 1727203819.34744: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203819.34748: Calling all_plugins_play to load vars for managed-node1
13271 1727203819.34750: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203819.34753: Calling groups_plugins_play to load vars for managed-node1
13271 1727203819.34893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203819.35068: done with get_vars()
13271 1727203819.35078: done getting variables
13271 1727203819.35125: in VariableManager get_vars()
13271 1727203819.35133: Calling all_inventory to load vars for managed-node1
13271 1727203819.35135: Calling groups_inventory to load vars for managed-node1
13271 1727203819.35137: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203819.35141: Calling all_plugins_play to load vars for managed-node1
13271 1727203819.35143: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203819.35146: Calling groups_plugins_play to load vars for managed-node1
13271 1727203819.35280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203819.35488: done with get_vars()
13271 1727203819.35499: done queuing things up, now waiting for results queue to drain
13271 1727203819.35501: results queue empty
13271 1727203819.35501: checking for any_errors_fatal
13271 1727203819.35503: done checking for any_errors_fatal
13271 1727203819.35503: checking for max_fail_percentage
13271 1727203819.35504: done checking for max_fail_percentage
13271 1727203819.35505: checking to see if all hosts have failed and the running result is not ok
13271 1727203819.35506: done checking to see if all hosts have failed
13271 1727203819.35506: getting the remaining hosts for this loop
13271 1727203819.35507: done getting the remaining hosts for this loop
13271 1727203819.35510: getting the next task for host managed-node1
13271 1727203819.35512: done getting next task for host managed-node1
13271 1727203819.35513: ^ task is: None
13271 1727203819.35515: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203819.35516: done queuing things up, now waiting for results queue to drain
13271 1727203819.35517: results queue empty
13271 1727203819.35517: checking for any_errors_fatal
13271 1727203819.35518: done checking for any_errors_fatal
13271 1727203819.35519: checking for max_fail_percentage
13271 1727203819.35520: done checking for max_fail_percentage
13271 1727203819.35520: checking to see if all hosts have failed and the running result is not ok
13271 1727203819.35521: done checking to see if all hosts have failed
13271 1727203819.35523: getting the next task for host managed-node1
13271 1727203819.35525: done getting next task for host managed-node1
13271 1727203819.35531: ^ task is: None
13271 1727203819.35532: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 13271 1727203819.35582: in VariableManager get_vars() 13271 1727203819.35604: done with get_vars() 13271 1727203819.35610: in VariableManager get_vars() 13271 1727203819.35622: done with get_vars() 13271 1727203819.35625: variable 'omit' from source: magic vars 13271 1727203819.35657: in VariableManager get_vars() 13271 1727203819.35670: done with get_vars() 13271 1727203819.35690: variable 'omit' from source: magic vars PLAY [Play for testing bond connection] **************************************** 13271 1727203819.36304: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 13271 1727203819.36332: getting the remaining hosts for this loop 13271 1727203819.36333: done getting the remaining hosts for this loop 13271 1727203819.36336: getting the next task for host managed-node1 13271 1727203819.36338: done getting next task for host managed-node1 13271 1727203819.36340: ^ task is: TASK: Gathering Facts 13271 1727203819.36341: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203819.36343: getting variables 13271 1727203819.36344: in VariableManager get_vars() 13271 1727203819.36356: Calling all_inventory to load vars for managed-node1 13271 1727203819.36358: Calling groups_inventory to load vars for managed-node1 13271 1727203819.36360: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203819.36365: Calling all_plugins_play to load vars for managed-node1 13271 1727203819.36381: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203819.36385: Calling groups_plugins_play to load vars for managed-node1 13271 1727203819.36524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203819.36702: done with get_vars() 13271 1727203819.36710: done getting variables 13271 1727203819.36754: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3 Tuesday 24 September 2024 14:50:19 -0400 (0:00:00.052) 0:00:03.010 ***** 13271 1727203819.36781: entering _queue_task() for managed-node1/gather_facts 13271 1727203819.37089: worker is 1 (out of 1 available) 13271 1727203819.37100: exiting _queue_task() for managed-node1/gather_facts 13271 1727203819.37112: done queuing things up, now waiting for results queue to drain 13271 1727203819.37113: waiting for pending results... 
13271 1727203819.37393: running TaskExecutor() for managed-node1/TASK: Gathering Facts 13271 1727203819.37430: in run() - task 028d2410-947f-2a40-12ba-000000000128 13271 1727203819.37451: variable 'ansible_search_path' from source: unknown 13271 1727203819.37502: calling self._execute() 13271 1727203819.37589: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203819.37607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203819.37620: variable 'omit' from source: magic vars 13271 1727203819.38026: variable 'ansible_distribution_major_version' from source: facts 13271 1727203819.38029: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203819.38032: variable 'omit' from source: magic vars 13271 1727203819.38034: variable 'omit' from source: magic vars 13271 1727203819.38134: variable 'omit' from source: magic vars 13271 1727203819.38137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203819.38164: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203819.38191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203819.38210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203819.38223: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203819.38264: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203819.38272: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203819.38280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203819.38383: Set connection var ansible_connection to ssh 13271 1727203819.38395: Set 
connection var ansible_shell_type to sh 13271 1727203819.38407: Set connection var ansible_timeout to 10 13271 1727203819.38416: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203819.38425: Set connection var ansible_pipelining to False 13271 1727203819.38434: Set connection var ansible_shell_executable to /bin/sh 13271 1727203819.38567: variable 'ansible_shell_executable' from source: unknown 13271 1727203819.38571: variable 'ansible_connection' from source: unknown 13271 1727203819.38573: variable 'ansible_module_compression' from source: unknown 13271 1727203819.38578: variable 'ansible_shell_type' from source: unknown 13271 1727203819.38580: variable 'ansible_shell_executable' from source: unknown 13271 1727203819.38582: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203819.38584: variable 'ansible_pipelining' from source: unknown 13271 1727203819.38586: variable 'ansible_timeout' from source: unknown 13271 1727203819.38588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203819.38692: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203819.38714: variable 'omit' from source: magic vars 13271 1727203819.38722: starting attempt loop 13271 1727203819.38728: running the handler 13271 1727203819.38745: variable 'ansible_facts' from source: unknown 13271 1727203819.38766: _low_level_execute_command(): starting 13271 1727203819.38814: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203819.39607: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203819.39619: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 13271 1727203819.39680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203819.39747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203819.39757: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203819.39788: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203819.39890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203819.41673: stdout chunk (state=3): >>>/root <<< 13271 1727203819.41839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203819.41843: stdout chunk (state=3): >>><<< 13271 1727203819.41845: stderr chunk (state=3): >>><<< 13271 1727203819.41873: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203819.41963: _low_level_execute_command(): starting 13271 1727203819.41967: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203819.4188147-13582-64732345401712 `" && echo ansible-tmp-1727203819.4188147-13582-64732345401712="` echo /root/.ansible/tmp/ansible-tmp-1727203819.4188147-13582-64732345401712 `" ) && sleep 0' 13271 1727203819.42586: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203819.42653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203819.42657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203819.42662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203819.42665: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203819.42677: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203819.42680: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203819.42682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 1727203819.42690: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203819.42745: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203819.42763: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203819.42774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203819.42878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203819.45194: stdout chunk (state=3): >>>ansible-tmp-1727203819.4188147-13582-64732345401712=/root/.ansible/tmp/ansible-tmp-1727203819.4188147-13582-64732345401712 <<< 13271 1727203819.45269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203819.45273: stdout chunk (state=3): >>><<< 13271 1727203819.45281: stderr chunk (state=3): >>><<< 13271 1727203819.45300: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203819.4188147-13582-64732345401712=/root/.ansible/tmp/ansible-tmp-1727203819.4188147-13582-64732345401712 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203819.45480: variable 'ansible_module_compression' from source: unknown 13271 1727203819.45484: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 13271 1727203819.45486: variable 'ansible_facts' from source: unknown 13271 1727203819.45826: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203819.4188147-13582-64732345401712/AnsiballZ_setup.py 13271 1727203819.46218: Sending initial data 13271 1727203819.46221: Sent initial data (153 bytes) 13271 1727203819.47537: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203819.47714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203819.47729: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203819.47741: stderr chunk (state=3): >>>debug2: match found <<< 13271 1727203819.47756: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203819.47999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203819.48002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203819.48106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203819.49839: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203819.49941: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203819.50043: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpzk0yeyu2 /root/.ansible/tmp/ansible-tmp-1727203819.4188147-13582-64732345401712/AnsiballZ_setup.py <<< 13271 1727203819.50058: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203819.4188147-13582-64732345401712/AnsiballZ_setup.py" <<< 13271 1727203819.50212: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpzk0yeyu2" to remote "/root/.ansible/tmp/ansible-tmp-1727203819.4188147-13582-64732345401712/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203819.4188147-13582-64732345401712/AnsiballZ_setup.py" <<< 13271 1727203819.53415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203819.53450: stderr chunk (state=3): >>><<< 13271 1727203819.53685: stdout chunk (state=3): >>><<< 13271 1727203819.53689: done transferring module to remote 13271 1727203819.53691: _low_level_execute_command(): starting 13271 1727203819.53693: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203819.4188147-13582-64732345401712/ /root/.ansible/tmp/ansible-tmp-1727203819.4188147-13582-64732345401712/AnsiballZ_setup.py && sleep 0' 13271 1727203819.54841: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203819.54857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203819.54865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203819.54970: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 13271 1727203819.54984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203819.55026: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203819.55077: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203819.55208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203819.57258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203819.57290: stderr chunk (state=3): >>><<< 13271 1727203819.57332: stdout chunk (state=3): >>><<< 13271 1727203819.57358: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203819.57391: _low_level_execute_command(): starting 13271 1727203819.57422: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203819.4188147-13582-64732345401712/AnsiballZ_setup.py && sleep 0' 13271 1727203819.58711: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203819.58799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203819.58841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203819.58853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203819.58974: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 13271 1727203819.59089: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203820.25722: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", 
"LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2928, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 603, "free": 2928}, 
"nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": <<< 13271 1727203820.25771: stdout chunk (state=3): >>>"NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 411, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": 
"rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785694208, "block_size": 4096, "block_total": 65519099, "block_available": 63912523, "block_used": 1606576, "inode_total": 131070960, "inode_available": 131027285, "inode_used": 43675, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "50", "second": "20", "epoch": "1727203820", "epoch_int": "1727203820", "date": "2024-09-24", "time": "14:50:20", "iso8601_micro": "2024-09-24T18:50:20.206512Z", "iso8601": "2024-09-24T18:50:20Z", "iso8601_basic": "20240924T145020206512", "iso8601_basic_short": "20240924T145020", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.52392578125, "5m": 0.28857421875, "15m": 0.13525390625}, "ansible_dns": 
{"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", 
"tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", 
"rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": 
["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 13271 1727203820.28156: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 13271 1727203820.28182: stderr chunk (state=3): >>><<< 13271 1727203820.28191: stdout chunk (state=3): >>><<< 13271 1727203820.28229: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCWk3MCbpUJBEaXgG200pw3DBo34ukitT+wfYcFTuNXyUUPaXQe7v940X0cf5U78BgS3AAiRxfHirMb+4r43rwxBe5tl4Vq2WM+kz3JnOtxK8ZXTmwS9PbltzX5fg5CVds9Bu6KIwABJMlgT9CTHVjFlTBc8wpoeJvx8wVgMwQlnF+PFN/lzD0gn0sF11bqe2QCvxmm9r7Lum/QdIVGgOiZMMbULJZb0Iih24Tn74Ho6n9zLSFZ5FiFifjm7M6k1hVtfcAQi3GfPdUxkN0Z66f3KaW4hptFlHxttjLliuAyfNF4UrXIab7y/nDix1ll4x4lLHsVRpcwtmVVe/Z+2/pcmRv4fjw8YzWY1pLV5u1BUUBthICgfv0pXEuLP9UD/krnjy0Ho9gAbtjdoRWtn7gvjRds+WEwk83rZS3UnAc3pl2DmHzbp4IfRC1zp8eJPJoVwcSTEr61su59tkNntjdKAcLWeas1p9XBaTIRx7aqRp7Vdet96xbSnDnBCriXgAU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJzECuW8BnvEbYnQxnxmebvg9TYk9r0OUd9aUg8FFv4MvjSzW8tCfnW556hw9n4PI2hShtAWz7XExrMZPlTQXRo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINUE+mKAHEgVl/vTdVMwRCu3lDCTOYBl1RcikvxylCeg", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", 
"ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.45.138 50362 10.31.14.47 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.45.138 50362 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_system": "Linux", "ansible_kernel": "6.11.0-25.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Mon Sep 16 20:35:26 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec277914f6c5b9c03bd977e30033112b", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": 
"/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2928, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 603, "free": 2928}, "nocache": {"free": 3280, "used": 251}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_uuid": "ec277914-f6c5-b9c0-3bd9-77e30033112b", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["973ca870-ed1b-4e56-a8b4-735608119a28"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, 
"holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["973ca870-ed1b-4e56-a8b4-735608119a28"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 411, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261785694208, "block_size": 4096, "block_total": 65519099, "block_available": 63912523, "block_used": 1606576, "inode_total": 131070960, "inode_available": 131027285, "inode_used": 43675, "uuid": "973ca870-ed1b-4e56-a8b4-735608119a28"}], "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "50", "second": "20", "epoch": "1727203820", "epoch_int": "1727203820", "date": "2024-09-24", "time": "14:50:20", "iso8601_micro": "2024-09-24T18:50:20.206512Z", "iso8601": "2024-09-24T18:50:20Z", "iso8601_basic": "20240924T145020206512", "iso8601_basic_short": "20240924T145020", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, 
"ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-25.el10.x86_64", "root": "UUID=973ca870-ed1b-4e56-a8b4-735608119a28", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.52392578125, "5m": 0.28857421875, "15m": 0.13525390625}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:11e86335-d786-4518-8abc-c9417b351256", "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:ddff:fe89:9be5", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off 
[fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", 
"tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.14.47", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:dd:89:9b:e5", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.14.47"], "ansible_all_ipv6_addresses": ["fe80::8ff:ddff:fe89:9be5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.14.47", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:ddff:fe89:9be5"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 13271 1727203820.28684: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203819.4188147-13582-64732345401712/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203820.28687: _low_level_execute_command(): starting 13271 1727203820.28690: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203819.4188147-13582-64732345401712/ > /dev/null 2>&1 && sleep 0' 13271 1727203820.29334: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203820.29347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203820.29457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203820.29480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203820.29506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203820.29608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203820.31640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203820.31644: stdout chunk (state=3): >>><<< 13271 1727203820.31648: stderr chunk (state=3): >>><<< 13271 1727203820.31786: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 13271 1727203820.31790: handler run complete 13271 1727203820.31794: variable 'ansible_facts' from source: unknown 13271 1727203820.31912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203820.32238: variable 'ansible_facts' from source: unknown 13271 1727203820.32336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203820.32472: attempt loop complete, returning result 13271 1727203820.32483: _execute() done 13271 1727203820.32491: dumping result to json 13271 1727203820.32525: done dumping result, returning 13271 1727203820.32548: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [028d2410-947f-2a40-12ba-000000000128] 13271 1727203820.32559: sending task result for task 028d2410-947f-2a40-12ba-000000000128 ok: [managed-node1] 13271 1727203820.33426: no more pending results, returning what we have 13271 1727203820.33429: results queue empty 13271 1727203820.33429: checking for any_errors_fatal 13271 1727203820.33431: done checking for any_errors_fatal 13271 1727203820.33431: checking for max_fail_percentage 13271 1727203820.33433: done checking for max_fail_percentage 13271 1727203820.33433: checking to see if all hosts have failed and the running result is not ok 13271 1727203820.33434: done checking to see if all hosts have failed 13271 1727203820.33435: getting the remaining hosts for this loop 13271 1727203820.33436: done getting the remaining hosts for this loop 13271 1727203820.33440: getting the next task for host managed-node1 13271 1727203820.33445: done getting next task for host managed-node1 13271 1727203820.33447: ^ task is: TASK: meta (flush_handlers) 13271 1727203820.33449: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203820.33452: getting variables 13271 1727203820.33454: in VariableManager get_vars() 13271 1727203820.33539: Calling all_inventory to load vars for managed-node1 13271 1727203820.33542: Calling groups_inventory to load vars for managed-node1 13271 1727203820.33545: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203820.33551: done sending task result for task 028d2410-947f-2a40-12ba-000000000128 13271 1727203820.33554: WORKER PROCESS EXITING 13271 1727203820.33563: Calling all_plugins_play to load vars for managed-node1 13271 1727203820.33567: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203820.33570: Calling groups_plugins_play to load vars for managed-node1 13271 1727203820.33753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203820.33954: done with get_vars() 13271 1727203820.33964: done getting variables 13271 1727203820.34040: in VariableManager get_vars() 13271 1727203820.34054: Calling all_inventory to load vars for managed-node1 13271 1727203820.34056: Calling groups_inventory to load vars for managed-node1 13271 1727203820.34058: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203820.34062: Calling all_plugins_play to load vars for managed-node1 13271 1727203820.34064: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203820.34067: Calling groups_plugins_play to load vars for managed-node1 13271 1727203820.34207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203820.34397: done with get_vars() 13271 1727203820.34410: done queuing things up, now waiting for results queue to drain 13271 1727203820.34412: results queue empty 13271 1727203820.34413: checking for any_errors_fatal 13271 1727203820.34416: done 
checking for any_errors_fatal
13271 1727203820.34416: checking for max_fail_percentage
13271 1727203820.34417: done checking for max_fail_percentage
13271 1727203820.34418: checking to see if all hosts have failed and the running result is not ok
13271 1727203820.34422: done checking to see if all hosts have failed
13271 1727203820.34423: getting the remaining hosts for this loop
13271 1727203820.34424: done getting the remaining hosts for this loop
13271 1727203820.34426: getting the next task for host managed-node1
13271 1727203820.34430: done getting next task for host managed-node1
13271 1727203820.34432: ^ task is: TASK: INIT Prepare setup
13271 1727203820.34433: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203820.34435: getting variables
13271 1727203820.34436: in VariableManager get_vars()
13271 1727203820.34448: Calling all_inventory to load vars for managed-node1
13271 1727203820.34451: Calling groups_inventory to load vars for managed-node1
13271 1727203820.34452: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203820.34456: Calling all_plugins_play to load vars for managed-node1
13271 1727203820.34458: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203820.34461: Calling groups_plugins_play to load vars for managed-node1
13271 1727203820.34602: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203820.34822: done with get_vars()
13271 1727203820.34831: done getting variables
13271 1727203820.34913: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [INIT Prepare setup] ******************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:15
Tuesday 24 September 2024  14:50:20 -0400 (0:00:00.981)       0:00:03.992 *****
13271 1727203820.34938: entering _queue_task() for managed-node1/debug
13271 1727203820.34940: Creating lock for debug
13271 1727203820.35279: worker is 1 (out of 1 available)
13271 1727203820.35291: exiting _queue_task() for managed-node1/debug
13271 1727203820.35302: done queuing things up, now waiting for results queue to drain
13271 1727203820.35304: waiting for pending results...
13271 1727203820.35540: running TaskExecutor() for managed-node1/TASK: INIT Prepare setup
13271 1727203820.35636: in run() - task 028d2410-947f-2a40-12ba-00000000000b
13271 1727203820.35658: variable 'ansible_search_path' from source: unknown
13271 1727203820.35712: calling self._execute()
13271 1727203820.35805: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203820.35818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203820.35834: variable 'omit' from source: magic vars
13271 1727203820.36220: variable 'ansible_distribution_major_version' from source: facts
13271 1727203820.36280: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203820.36284: variable 'omit' from source: magic vars
13271 1727203820.36287: variable 'omit' from source: magic vars
13271 1727203820.36311: variable 'omit' from source: magic vars
13271 1727203820.36358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13271 1727203820.36402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13271 1727203820.36436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13271 1727203820.36460: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13271 1727203820.36545: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13271 1727203820.36549: variable 'inventory_hostname' from source: host vars for 'managed-node1'
13271 1727203820.36551: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203820.36554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203820.36629: Set connection var ansible_connection to ssh
13271 1727203820.36643: Set connection var ansible_shell_type to sh
13271 1727203820.36666: Set connection var ansible_timeout to 10
13271 1727203820.36680: Set connection var ansible_module_compression to ZIP_DEFLATED
13271 1727203820.36690: Set connection var ansible_pipelining to False
13271 1727203820.36700: Set connection var ansible_shell_executable to /bin/sh
13271 1727203820.36728: variable 'ansible_shell_executable' from source: unknown
13271 1727203820.36737: variable 'ansible_connection' from source: unknown
13271 1727203820.36744: variable 'ansible_module_compression' from source: unknown
13271 1727203820.36751: variable 'ansible_shell_type' from source: unknown
13271 1727203820.36765: variable 'ansible_shell_executable' from source: unknown
13271 1727203820.36872: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203820.36878: variable 'ansible_pipelining' from source: unknown
13271 1727203820.36880: variable 'ansible_timeout' from source: unknown
13271 1727203820.36883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203820.36940: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
13271 1727203820.36957: variable 'omit' from source: magic vars
13271 1727203820.36967: starting attempt loop
13271 1727203820.36981: running the handler
13271 1727203820.37026: handler run complete
13271 1727203820.37043: attempt loop complete, returning result
13271 1727203820.37046: _execute() done
13271 1727203820.37048: dumping result to json
13271 1727203820.37051: done dumping result, returning
13271 1727203820.37058: done running TaskExecutor() for managed-node1/TASK: INIT Prepare setup [028d2410-947f-2a40-12ba-00000000000b]
13271 1727203820.37063: sending task result for task 028d2410-947f-2a40-12ba-00000000000b
13271 1727203820.37139: done sending task result for task 028d2410-947f-2a40-12ba-00000000000b
13271 1727203820.37142: WORKER PROCESS EXITING
ok: [managed-node1] => {}

MSG:

##################################################
13271 1727203820.37230: no more pending results, returning what we have
13271 1727203820.37233: results queue empty
13271 1727203820.37234: checking for any_errors_fatal
13271 1727203820.37235: done checking for any_errors_fatal
13271 1727203820.37236: checking for max_fail_percentage
13271 1727203820.37238: done checking for max_fail_percentage
13271 1727203820.37238: checking to see if all hosts have failed and the running result is not ok
13271 1727203820.37239: done checking to see if all hosts have failed
13271 1727203820.37240: getting the remaining hosts for this loop
13271 1727203820.37241: done getting the remaining hosts for this loop
13271 1727203820.37244: getting the next task for host managed-node1
13271 1727203820.37250: done getting next task for host managed-node1
13271 1727203820.37252: ^ task is: TASK: Install dnsmasq
13271 1727203820.37255: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203820.37258: getting variables
13271 1727203820.37259: in VariableManager get_vars()
13271 1727203820.37298: Calling all_inventory to load vars for managed-node1
13271 1727203820.37300: Calling groups_inventory to load vars for managed-node1
13271 1727203820.37302: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203820.37311: Calling all_plugins_play to load vars for managed-node1
13271 1727203820.37313: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203820.37315: Calling groups_plugins_play to load vars for managed-node1
13271 1727203820.37477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203820.37674: done with get_vars()
13271 1727203820.37685: done getting variables
13271 1727203820.37741: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Install dnsmasq] *********************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
Tuesday 24 September 2024  14:50:20 -0400 (0:00:00.028)       0:00:04.020 *****
13271 1727203820.37772: entering _queue_task() for managed-node1/package
13271 1727203820.38016: worker is 1 (out of 1 available)
13271 1727203820.38026: exiting _queue_task() for managed-node1/package
13271 1727203820.38038: done queuing things up, now waiting for results queue to drain
13271 1727203820.38040: waiting for pending results...
13271 1727203820.38392: running TaskExecutor() for managed-node1/TASK: Install dnsmasq
13271 1727203820.38397: in run() - task 028d2410-947f-2a40-12ba-00000000000f
13271 1727203820.38401: variable 'ansible_search_path' from source: unknown
13271 1727203820.38404: variable 'ansible_search_path' from source: unknown
13271 1727203820.38406: calling self._execute()
13271 1727203820.38453: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203820.38464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203820.38482: variable 'omit' from source: magic vars
13271 1727203820.38834: variable 'ansible_distribution_major_version' from source: facts
13271 1727203820.38857: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203820.38868: variable 'omit' from source: magic vars
13271 1727203820.38918: variable 'omit' from source: magic vars
13271 1727203820.39184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13271 1727203820.40734: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13271 1727203820.40786: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13271 1727203820.40812: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13271 1727203820.40836: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13271 1727203820.40857: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13271 1727203820.40926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203820.40945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203820.40965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203820.40992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203820.41003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203820.41073: variable '__network_is_ostree' from source: set_fact
13271 1727203820.41079: variable 'omit' from source: magic vars
13271 1727203820.41101: variable 'omit' from source: magic vars
13271 1727203820.41121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13271 1727203820.41141: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13271 1727203820.41156: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13271 1727203820.41169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13271 1727203820.41178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13271 1727203820.41201: variable 'inventory_hostname' from source: host vars for 'managed-node1'
13271 1727203820.41204: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203820.41207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203820.41270: Set connection var ansible_connection to ssh
13271 1727203820.41277: Set connection var ansible_shell_type to sh
13271 1727203820.41285: Set connection var ansible_timeout to 10
13271 1727203820.41289: Set connection var ansible_module_compression to ZIP_DEFLATED
13271 1727203820.41295: Set connection var ansible_pipelining to False
13271 1727203820.41301: Set connection var ansible_shell_executable to /bin/sh
13271 1727203820.41318: variable 'ansible_shell_executable' from source: unknown
13271 1727203820.41321: variable 'ansible_connection' from source: unknown
13271 1727203820.41323: variable 'ansible_module_compression' from source: unknown
13271 1727203820.41325: variable 'ansible_shell_type' from source: unknown
13271 1727203820.41328: variable 'ansible_shell_executable' from source: unknown
13271 1727203820.41330: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203820.41333: variable 'ansible_pipelining' from source: unknown
13271 1727203820.41335: variable 'ansible_timeout' from source: unknown
13271 1727203820.41341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203820.41426: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
13271 1727203820.41434: variable 'omit' from source: magic vars
13271 1727203820.41439: starting attempt loop
13271 1727203820.41441: running the handler
13271 1727203820.41447: variable 'ansible_facts' from source: unknown
13271 1727203820.41455: variable 'ansible_facts' from source: unknown
13271 1727203820.41483: _low_level_execute_command(): starting
13271 1727203820.41489: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
13271 1727203820.42040: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13271 1727203820.42044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13271 1727203820.42046: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
13271 1727203820.42048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13271 1727203820.42106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
13271 1727203820.42110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13271 1727203820.42116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13271 1727203820.42200: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13271 1727203820.44008: stdout chunk (state=3): >>>/root <<<
13271 1727203820.44110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13271 1727203820.44135: stderr chunk (state=3): >>><<<
13271 1727203820.44138: stdout chunk (state=3): >>><<<
13271 1727203820.44160: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13271 1727203820.44171: _low_level_execute_command(): starting
13271 1727203820.44179: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203820.4415853-13668-215927693741738 `" && echo ansible-tmp-1727203820.4415853-13668-215927693741738="` echo /root/.ansible/tmp/ansible-tmp-1727203820.4415853-13668-215927693741738 `" ) && sleep 0'
13271 1727203820.44595: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
13271 1727203820.44598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13271 1727203820.44603: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
13271 1727203820.44605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<<
13271 1727203820.44607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13271 1727203820.44650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
13271 1727203820.44655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
13271 1727203820.44743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13271 1727203820.47361: stdout chunk (state=3): >>>ansible-tmp-1727203820.4415853-13668-215927693741738=/root/.ansible/tmp/ansible-tmp-1727203820.4415853-13668-215927693741738 <<<
13271 1727203820.47464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13271 1727203820.47494: stderr chunk (state=3): >>><<<
13271 1727203820.47497: stdout chunk (state=3): >>><<<
13271 1727203820.47514: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203820.4415853-13668-215927693741738=/root/.ansible/tmp/ansible-tmp-1727203820.4415853-13668-215927693741738 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13271 1727203820.47538: variable 'ansible_module_compression' from source: unknown
13271 1727203820.47586: ANSIBALLZ: Using generic lock for ansible.legacy.dnf
13271 1727203820.47589: ANSIBALLZ: Acquiring lock
13271 1727203820.47592: ANSIBALLZ: Lock acquired: 140497830695696
13271 1727203820.47594: ANSIBALLZ: Creating module
13271 1727203820.58683: ANSIBALLZ: Writing module into payload
13271 1727203820.58722: ANSIBALLZ: Writing module
13271 1727203820.58749: ANSIBALLZ: Renaming module
13271 1727203820.58759: ANSIBALLZ: Done creating module
13271 1727203820.58783: variable 'ansible_facts' from source: unknown
13271 1727203820.58900: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203820.4415853-13668-215927693741738/AnsiballZ_dnf.py
13271 1727203820.59127: Sending initial data
13271 1727203820.59150: Sent initial data (152 bytes)
13271 1727203820.59729: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<<
13271 1727203820.59794: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13271 1727203820.59925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13271 1727203820.61671: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
13271 1727203820.61756: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
13271 1727203820.61935: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmp2uxleuvd /root/.ansible/tmp/ansible-tmp-1727203820.4415853-13668-215927693741738/AnsiballZ_dnf.py <<<
13271 1727203820.61938: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203820.4415853-13668-215927693741738/AnsiballZ_dnf.py" <<<
13271 1727203820.62011: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmp2uxleuvd" to remote "/root/.ansible/tmp/ansible-tmp-1727203820.4415853-13668-215927693741738/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203820.4415853-13668-215927693741738/AnsiballZ_dnf.py" <<<
13271 1727203820.63129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13271 1727203820.63250: stderr chunk (state=3): >>><<<
13271 1727203820.63253: stdout chunk (state=3): >>><<<
13271 1727203820.63256: done transferring module to remote
13271 1727203820.63258: _low_level_execute_command(): starting
13271 1727203820.63260: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203820.4415853-13668-215927693741738/ /root/.ansible/tmp/ansible-tmp-1727203820.4415853-13668-215927693741738/AnsiballZ_dnf.py && sleep 0'
13271 1727203820.63813: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13271 1727203820.63830: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13271 1727203820.63846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13271 1727203820.63865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13271 1727203820.63928: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13271 1727203820.63993: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
13271 1727203820.64013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13271 1727203820.64055: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13271 1727203820.64144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13271 1727203820.66280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13271 1727203820.66284: stdout chunk (state=3): >>><<<
13271 1727203820.66287: stderr chunk (state=3): >>><<<
13271 1727203820.66432: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13271 1727203820.66436: _low_level_execute_command(): starting
13271 1727203820.66438: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203820.4415853-13668-215927693741738/AnsiballZ_dnf.py && sleep 0'
13271 1727203820.67543: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13271 1727203820.67555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13271 1727203820.67573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13271 1727203820.67596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13271 1727203820.67614: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<<
13271 1727203820.67692: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13271 1727203820.67719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<<
13271 1727203820.67742: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
13271 1727203820.67758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13271 1727203820.67994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13271 1727203822.58680: stdout chunk (state=3): >>> <<<
13271 1727203822.58871: stdout chunk (state=3): >>>{"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-3.el10.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<<
13271 1727203822.66221: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<<
13271 1727203822.66231: stdout chunk (state=3): >>><<<
13271 1727203822.66689: stderr chunk (state=3): >>><<<
13271 1727203822.66693: _low_level_execute_command() done: rc=0, stdout= {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-3.el10.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed.
13271 1727203822.66701: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203820.4415853-13668-215927693741738/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
13271 1727203822.66704: _low_level_execute_command(): starting
13271 1727203822.66706: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203820.4415853-13668-215927693741738/ > /dev/null 2>&1 && sleep 0'
13271 1727203822.67733: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<<
13271 1727203822.67753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13271 1727203822.67785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
13271 1727203822.67854: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<<
13271 1727203822.67883: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13271 1727203822.67974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13271 1727203822.70104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13271 1727203822.70119: stdout chunk (state=3): >>><<<
13271 1727203822.70138: stderr chunk (state=3): >>><<<
13271 1727203822.70162: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13271 1727203822.70175: handler run complete
13271 1727203822.70430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13271 1727203822.70747: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13271 1727203822.70801: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13271 1727203822.70835: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13271 1727203822.70882: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13271 1727203822.70958: variable '__install_status' from source: unknown
13271 1727203822.70995: Evaluated conditional (__install_status is success): True
13271 1727203822.71081: attempt loop complete, returning result
13271 1727203822.71083: _execute() done
13271 1727203822.71087: dumping result to json
13271 1727203822.71089: done dumping result, returning
13271 1727203822.71090: done running TaskExecutor() for managed-node1/TASK: Install dnsmasq [028d2410-947f-2a40-12ba-00000000000f]
13271 1727203822.71092: sending task result for task 028d2410-947f-2a40-12ba-00000000000f
13271 1727203822.71164: done sending task result for task 028d2410-947f-2a40-12ba-00000000000f
13271 1727203822.71166: WORKER PROCESS EXITING
changed: [managed-node1] => {
    "attempts": 1,
    "changed": true,
    "rc": 0,
    "results": [
        "Installed: dnsmasq-2.90-3.el10.x86_64"
    ]
}
13271 1727203822.71245: no more pending results, returning what we have
13271 1727203822.71248: results queue empty
13271 1727203822.71249: checking for any_errors_fatal
13271 1727203822.71253: done checking for any_errors_fatal
13271 1727203822.71254: checking for max_fail_percentage
13271
1727203822.71255: done checking for max_fail_percentage 13271 1727203822.71256: checking to see if all hosts have failed and the running result is not ok 13271 1727203822.71257: done checking to see if all hosts have failed 13271 1727203822.71257: getting the remaining hosts for this loop 13271 1727203822.71258: done getting the remaining hosts for this loop 13271 1727203822.71262: getting the next task for host managed-node1 13271 1727203822.71419: done getting next task for host managed-node1 13271 1727203822.71422: ^ task is: TASK: Install pgrep, sysctl 13271 1727203822.71424: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203822.71428: getting variables 13271 1727203822.71429: in VariableManager get_vars() 13271 1727203822.71472: Calling all_inventory to load vars for managed-node1 13271 1727203822.71479: Calling groups_inventory to load vars for managed-node1 13271 1727203822.71481: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203822.71491: Calling all_plugins_play to load vars for managed-node1 13271 1727203822.71493: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203822.71502: Calling groups_plugins_play to load vars for managed-node1 13271 1727203822.71687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203822.72002: done with get_vars() 13271 1727203822.72012: done getting variables 13271 1727203822.72406: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Tuesday 24 September 2024 14:50:22 -0400 (0:00:02.347) 0:00:06.368 ***** 13271 1727203822.72520: entering _queue_task() for managed-node1/package 13271 1727203822.73318: worker is 1 (out of 1 available) 13271 1727203822.73515: exiting _queue_task() for managed-node1/package 13271 1727203822.73528: done queuing things up, now waiting for results queue to drain 13271 1727203822.73530: waiting for pending results... 
13271 1727203822.73786: running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl 13271 1727203822.73837: in run() - task 028d2410-947f-2a40-12ba-000000000010 13271 1727203822.73847: variable 'ansible_search_path' from source: unknown 13271 1727203822.73851: variable 'ansible_search_path' from source: unknown 13271 1727203822.73883: calling self._execute() 13271 1727203822.73950: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203822.73953: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203822.73960: variable 'omit' from source: magic vars 13271 1727203822.74311: variable 'ansible_distribution_major_version' from source: facts 13271 1727203822.74314: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203822.74401: variable 'ansible_os_family' from source: facts 13271 1727203822.74406: Evaluated conditional (ansible_os_family == 'RedHat'): True 13271 1727203822.74555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13271 1727203822.74760: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13271 1727203822.74797: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13271 1727203822.74844: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13271 1727203822.74871: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13271 1727203822.74928: variable 'ansible_distribution_major_version' from source: facts 13271 1727203822.74938: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 13271 1727203822.74941: when evaluation is False, skipping this task 13271 1727203822.74944: _execute() done 13271 1727203822.74946: dumping result to json 13271 1727203822.74948: done dumping result, 
returning 13271 1727203822.74954: done running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl [028d2410-947f-2a40-12ba-000000000010] 13271 1727203822.74960: sending task result for task 028d2410-947f-2a40-12ba-000000000010 13271 1727203822.75056: done sending task result for task 028d2410-947f-2a40-12ba-000000000010 13271 1727203822.75058: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 13271 1727203822.75111: no more pending results, returning what we have 13271 1727203822.75114: results queue empty 13271 1727203822.75115: checking for any_errors_fatal 13271 1727203822.75123: done checking for any_errors_fatal 13271 1727203822.75123: checking for max_fail_percentage 13271 1727203822.75125: done checking for max_fail_percentage 13271 1727203822.75126: checking to see if all hosts have failed and the running result is not ok 13271 1727203822.75127: done checking to see if all hosts have failed 13271 1727203822.75127: getting the remaining hosts for this loop 13271 1727203822.75129: done getting the remaining hosts for this loop 13271 1727203822.75131: getting the next task for host managed-node1 13271 1727203822.75137: done getting next task for host managed-node1 13271 1727203822.75139: ^ task is: TASK: Install pgrep, sysctl 13271 1727203822.75142: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 13271 1727203822.75145: getting variables 13271 1727203822.75146: in VariableManager get_vars() 13271 1727203822.75180: Calling all_inventory to load vars for managed-node1 13271 1727203822.75183: Calling groups_inventory to load vars for managed-node1 13271 1727203822.75185: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203822.75194: Calling all_plugins_play to load vars for managed-node1 13271 1727203822.75196: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203822.75198: Calling groups_plugins_play to load vars for managed-node1 13271 1727203822.75330: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203822.75448: done with get_vars() 13271 1727203822.75458: done getting variables 13271 1727203822.75503: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Tuesday 24 September 2024 14:50:22 -0400 (0:00:00.030) 0:00:06.398 ***** 13271 1727203822.75543: entering _queue_task() for managed-node1/package 13271 1727203822.75826: worker is 1 (out of 1 available) 13271 1727203822.75839: exiting _queue_task() for managed-node1/package 13271 1727203822.75850: done queuing things up, now waiting for results queue to drain 13271 1727203822.75852: waiting for pending results... 
13271 1727203822.76294: running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl 13271 1727203822.76299: in run() - task 028d2410-947f-2a40-12ba-000000000011 13271 1727203822.76302: variable 'ansible_search_path' from source: unknown 13271 1727203822.76304: variable 'ansible_search_path' from source: unknown 13271 1727203822.76307: calling self._execute() 13271 1727203822.76367: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203822.76381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203822.76394: variable 'omit' from source: magic vars 13271 1727203822.76880: variable 'ansible_distribution_major_version' from source: facts 13271 1727203822.76912: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203822.77022: variable 'ansible_os_family' from source: facts 13271 1727203822.77026: Evaluated conditional (ansible_os_family == 'RedHat'): True 13271 1727203822.77148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13271 1727203822.77354: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13271 1727203822.77388: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13271 1727203822.77413: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13271 1727203822.77444: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13271 1727203822.77499: variable 'ansible_distribution_major_version' from source: facts 13271 1727203822.77509: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 13271 1727203822.77514: variable 'omit' from source: magic vars 13271 1727203822.77551: variable 'omit' from source: magic vars 13271 1727203822.77655: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13271 1727203822.79007: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13271 1727203822.79028: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13271 1727203822.79057: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13271 1727203822.79231: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13271 1727203822.79255: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13271 1727203822.79579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203822.79584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203822.79587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203822.79589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203822.79591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203822.79593: variable '__network_is_ostree' from source: set_fact 13271 1727203822.79595: 
variable 'omit' from source: magic vars 13271 1727203822.79597: variable 'omit' from source: magic vars 13271 1727203822.79602: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203822.79629: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203822.79649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203822.79665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203822.79678: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203822.79715: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203822.79718: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203822.79720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203822.79820: Set connection var ansible_connection to ssh 13271 1727203822.79828: Set connection var ansible_shell_type to sh 13271 1727203822.79836: Set connection var ansible_timeout to 10 13271 1727203822.79841: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203822.79847: Set connection var ansible_pipelining to False 13271 1727203822.79853: Set connection var ansible_shell_executable to /bin/sh 13271 1727203822.79879: variable 'ansible_shell_executable' from source: unknown 13271 1727203822.79883: variable 'ansible_connection' from source: unknown 13271 1727203822.79885: variable 'ansible_module_compression' from source: unknown 13271 1727203822.79887: variable 'ansible_shell_type' from source: unknown 13271 1727203822.79889: variable 'ansible_shell_executable' from source: unknown 13271 1727203822.79892: variable 'ansible_host' from source: host vars for 'managed-node1' 
13271 1727203822.79897: variable 'ansible_pipelining' from source: unknown 13271 1727203822.79899: variable 'ansible_timeout' from source: unknown 13271 1727203822.79903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203822.80016: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203822.80034: variable 'omit' from source: magic vars 13271 1727203822.80047: starting attempt loop 13271 1727203822.80056: running the handler 13271 1727203822.80149: variable 'ansible_facts' from source: unknown 13271 1727203822.80152: variable 'ansible_facts' from source: unknown 13271 1727203822.80154: _low_level_execute_command(): starting 13271 1727203822.80156: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203822.80983: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203822.81085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203822.81166: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203822.81170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203822.81210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203822.81331: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203822.83144: stdout chunk (state=3): >>>/root <<< 13271 1727203822.83276: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203822.83280: stdout chunk (state=3): >>><<< 13271 1727203822.83289: stderr chunk (state=3): >>><<< 13271 1727203822.83316: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 13271 1727203822.83325: _low_level_execute_command(): starting 13271 1727203822.83331: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203822.8331368-13827-237583002143037 `" && echo ansible-tmp-1727203822.8331368-13827-237583002143037="` echo /root/.ansible/tmp/ansible-tmp-1727203822.8331368-13827-237583002143037 `" ) && sleep 0' 13271 1727203822.83887: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203822.83890: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203822.83892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203822.83900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203822.83912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203822.83979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203822.83983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203822.84002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203822.84090: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203822.86201: stdout chunk (state=3): >>>ansible-tmp-1727203822.8331368-13827-237583002143037=/root/.ansible/tmp/ansible-tmp-1727203822.8331368-13827-237583002143037 <<< 13271 1727203822.86308: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203822.86338: stderr chunk (state=3): >>><<< 13271 1727203822.86341: stdout chunk (state=3): >>><<< 13271 1727203822.86352: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203822.8331368-13827-237583002143037=/root/.ansible/tmp/ansible-tmp-1727203822.8331368-13827-237583002143037 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203822.86419: variable 'ansible_module_compression' from source: unknown 13271 1727203822.86430: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 13271 1727203822.86466: variable 'ansible_facts' from source: unknown 13271 1727203822.86545: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203822.8331368-13827-237583002143037/AnsiballZ_dnf.py 13271 1727203822.86642: Sending initial data 13271 1727203822.86646: Sent initial data (152 bytes) 13271 1727203822.87074: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203822.87108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203822.87111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203822.87113: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 13271 1727203822.87115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203822.87117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203822.87165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203822.87171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203822.87253: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13271 1727203822.89025: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13271 1727203822.89057: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203822.89133: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203822.89242: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpdymmf8eb /root/.ansible/tmp/ansible-tmp-1727203822.8331368-13827-237583002143037/AnsiballZ_dnf.py <<< 13271 1727203822.89245: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203822.8331368-13827-237583002143037/AnsiballZ_dnf.py" <<< 13271 1727203822.89371: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpdymmf8eb" to remote "/root/.ansible/tmp/ansible-tmp-1727203822.8331368-13827-237583002143037/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203822.8331368-13827-237583002143037/AnsiballZ_dnf.py" <<< 13271 1727203822.91326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203822.91365: stderr chunk (state=3): >>><<< 13271 1727203822.91385: stdout chunk (state=3): >>><<< 13271 1727203822.91492: done transferring module to remote 13271 1727203822.91495: _low_level_execute_command(): starting 13271 1727203822.91497: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203822.8331368-13827-237583002143037/ /root/.ansible/tmp/ansible-tmp-1727203822.8331368-13827-237583002143037/AnsiballZ_dnf.py && sleep 0' 13271 1727203822.92219: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203822.92233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203822.92292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203822.92396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203822.92400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203822.92403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203822.92541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203822.94501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203822.94517: stderr chunk (state=3): >>><<< 13271 1727203822.94520: stdout chunk (state=3): >>><<< 13271 1727203822.94534: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203822.94537: _low_level_execute_command(): starting 13271 1727203822.94542: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203822.8331368-13827-237583002143037/AnsiballZ_dnf.py && sleep 0' 13271 1727203822.94983: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203822.94986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203822.94989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203822.94991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 13271 1727203822.94993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203822.95048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203822.95052: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203822.95131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203823.41877: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 13271 1727203823.47324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203823.47349: stderr chunk (state=3): >>><<< 13271 1727203823.47351: stdout chunk (state=3): >>><<< 13271 1727203823.47368: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 13271 1727203823.47408: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203822.8331368-13827-237583002143037/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203823.47413: _low_level_execute_command(): starting 13271 1727203823.47418: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203822.8331368-13827-237583002143037/ > /dev/null 2>&1 && sleep 0' 13271 1727203823.48096: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203823.48113: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 1727203823.48124: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 13271 1727203823.48200: stderr chunk (state=3): >>>debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203823.48230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203823.48246: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203823.48271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203823.48392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203823.50472: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203823.50479: stdout chunk (state=3): >>><<< 13271 1727203823.50482: stderr chunk (state=3): >>><<< 13271 1727203823.50681: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203823.50685: handler run complete 13271 1727203823.50687: attempt loop complete, returning result 13271 1727203823.50689: _execute() done 13271 1727203823.50691: dumping result to json 13271 1727203823.50693: done dumping result, returning 13271 1727203823.50695: done running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl [028d2410-947f-2a40-12ba-000000000011] 13271 1727203823.50697: sending task result for task 028d2410-947f-2a40-12ba-000000000011 13271 1727203823.50768: done sending task result for task 028d2410-947f-2a40-12ba-000000000011 13271 1727203823.50771: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do
13271 1727203823.50856: no more pending results, returning what we have 13271 1727203823.50861: results queue empty 13271 1727203823.50862: checking for any_errors_fatal 13271 1727203823.50869: done checking for any_errors_fatal 13271 1727203823.50870: checking for max_fail_percentage 13271 1727203823.50871: done checking for max_fail_percentage 13271 1727203823.50872: checking to see if all hosts have failed and the running result is not ok 13271 1727203823.50874: done checking to see if all hosts have failed 13271 1727203823.50874: getting the remaining hosts for this loop 13271 1727203823.50877: done getting the remaining hosts for this loop 13271 1727203823.50881: getting the next task for host managed-node1 13271 1727203823.50888: done getting next task for host managed-node1 13271 1727203823.50891: ^ task is: TASK: Create test interfaces 13271 1727203823.50894: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True,
pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203823.50897: getting variables 13271 1727203823.50899: in VariableManager get_vars() 13271 1727203823.50941: Calling all_inventory to load vars for managed-node1 13271 1727203823.50943: Calling groups_inventory to load vars for managed-node1 13271 1727203823.50947: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203823.50957: Calling all_plugins_play to load vars for managed-node1 13271 1727203823.50963: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203823.50967: Calling groups_plugins_play to load vars for managed-node1 13271 1727203823.52175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203823.52894: done with get_vars() 13271 1727203823.52907: done getting variables 13271 1727203823.53065: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Create test interfaces] **************************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Tuesday 24 September 2024  14:50:23 -0400 (0:00:00.777)       0:00:07.176 *****
13271 1727203823.53298: entering _queue_task() for managed-node1/shell 13271 1727203823.53300: Creating lock for shell 13271
1727203823.53969: worker is 1 (out of 1 available) 13271 1727203823.54086: exiting _queue_task() for managed-node1/shell 13271 1727203823.54105: done queuing things up, now waiting for results queue to drain 13271 1727203823.54107: waiting for pending results... 13271 1727203823.54395: running TaskExecutor() for managed-node1/TASK: Create test interfaces 13271 1727203823.54430: in run() - task 028d2410-947f-2a40-12ba-000000000012 13271 1727203823.54452: variable 'ansible_search_path' from source: unknown 13271 1727203823.54464: variable 'ansible_search_path' from source: unknown 13271 1727203823.54533: calling self._execute() 13271 1727203823.54611: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203823.54640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203823.54730: variable 'omit' from source: magic vars 13271 1727203823.55083: variable 'ansible_distribution_major_version' from source: facts 13271 1727203823.55100: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203823.55111: variable 'omit' from source: magic vars 13271 1727203823.55157: variable 'omit' from source: magic vars 13271 1727203823.55562: variable 'dhcp_interface1' from source: play vars 13271 1727203823.55573: variable 'dhcp_interface2' from source: play vars 13271 1727203823.55626: variable 'omit' from source: magic vars 13271 1727203823.55670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203823.55715: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203823.55751: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203823.55773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203823.55793: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203823.55840: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203823.55850: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203823.55880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203823.55959: Set connection var ansible_connection to ssh 13271 1727203823.55972: Set connection var ansible_shell_type to sh 13271 1727203823.55987: Set connection var ansible_timeout to 10 13271 1727203823.56032: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203823.56035: Set connection var ansible_pipelining to False 13271 1727203823.56037: Set connection var ansible_shell_executable to /bin/sh 13271 1727203823.56040: variable 'ansible_shell_executable' from source: unknown 13271 1727203823.56056: variable 'ansible_connection' from source: unknown 13271 1727203823.56065: variable 'ansible_module_compression' from source: unknown 13271 1727203823.56071: variable 'ansible_shell_type' from source: unknown 13271 1727203823.56079: variable 'ansible_shell_executable' from source: unknown 13271 1727203823.56085: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203823.56091: variable 'ansible_pipelining' from source: unknown 13271 1727203823.56096: variable 'ansible_timeout' from source: unknown 13271 1727203823.56142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203823.56254: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203823.56281: variable 'omit' from source: magic vars 13271 1727203823.56291: starting attempt 
loop 13271 1727203823.56297: running the handler 13271 1727203823.56310: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203823.56332: _low_level_execute_command(): starting 13271 1727203823.56343: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203823.57117: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203823.57133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203823.57257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203823.57282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203823.57432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203823.59278: stdout chunk (state=3): >>>/root <<< 
13271 1727203823.59390: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203823.59662: stderr chunk (state=3): >>><<< 13271 1727203823.59665: stdout chunk (state=3): >>><<< 13271 1727203823.59669: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203823.59672: _low_level_execute_command(): starting 13271 1727203823.59675: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203823.5958335-13861-261775827693854 `" && echo ansible-tmp-1727203823.5958335-13861-261775827693854="` echo /root/.ansible/tmp/ansible-tmp-1727203823.5958335-13861-261775827693854 `" ) && sleep 0' 13271 1727203823.60817: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
13271 1727203823.60892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203823.60904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203823.60919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203823.60931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203823.60939: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203823.60948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203823.60964: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 1727203823.61161: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203823.61202: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203823.61338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203823.63451: stdout chunk (state=3): >>>ansible-tmp-1727203823.5958335-13861-261775827693854=/root/.ansible/tmp/ansible-tmp-1727203823.5958335-13861-261775827693854 <<< 13271 1727203823.63632: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203823.63636: stdout chunk (state=3): >>><<< 13271 1727203823.63644: stderr 
chunk (state=3): >>><<< 13271 1727203823.63665: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203823.5958335-13861-261775827693854=/root/.ansible/tmp/ansible-tmp-1727203823.5958335-13861-261775827693854 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203823.63879: variable 'ansible_module_compression' from source: unknown 13271 1727203823.63884: ANSIBALLZ: Using generic lock for ansible.legacy.command 13271 1727203823.63886: ANSIBALLZ: Acquiring lock 13271 1727203823.63888: ANSIBALLZ: Lock acquired: 140497830695696 13271 1727203823.63890: ANSIBALLZ: Creating module 13271 1727203823.82188: ANSIBALLZ: Writing module into payload 13271 1727203823.82263: ANSIBALLZ: Writing module 13271 1727203823.82282: ANSIBALLZ: Renaming module 13271 1727203823.82301: ANSIBALLZ: Done creating module 13271 1727203823.82374: variable 'ansible_facts' 
from source: unknown 13271 1727203823.82389: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203823.5958335-13861-261775827693854/AnsiballZ_command.py 13271 1727203823.82602: Sending initial data 13271 1727203823.82605: Sent initial data (156 bytes) 13271 1727203823.83699: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203823.83710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203823.83982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203823.85683: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13271 1727203823.85691: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 13271 1727203823.85698: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 13271 1727203823.85763: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" 
revision 2 <<< 13271 1727203823.85766: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 13271 1727203823.85768: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 13271 1727203823.85770: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 13271 1727203823.85772: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 13271 1727203823.85774: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 13271 1727203823.85777: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 <<< 13271 1727203823.85779: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" <<< 13271 1727203823.85781: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203823.85948: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203823.86332: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpuo6gnbr3 /root/.ansible/tmp/ansible-tmp-1727203823.5958335-13861-261775827693854/AnsiballZ_command.py <<< 13271 1727203823.86335: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203823.5958335-13861-261775827693854/AnsiballZ_command.py" <<< 13271 1727203823.86386: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpuo6gnbr3" to remote "/root/.ansible/tmp/ansible-tmp-1727203823.5958335-13861-261775827693854/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203823.5958335-13861-261775827693854/AnsiballZ_command.py" <<< 13271 1727203823.87737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203823.87928: stderr chunk (state=3): >>><<< 13271 1727203823.87931: stdout chunk (state=3): >>><<< 13271 1727203823.87933: done transferring module to remote 13271 1727203823.87935: _low_level_execute_command(): starting 13271 1727203823.87941: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203823.5958335-13861-261775827693854/ /root/.ansible/tmp/ansible-tmp-1727203823.5958335-13861-261775827693854/AnsiballZ_command.py && sleep 0' 13271 1727203823.89189: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203823.89193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203823.89213: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203823.89294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203823.91296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203823.91300: stdout chunk (state=3): >>><<< 13271 1727203823.91306: stderr chunk (state=3): >>><<< 13271 1727203823.91339: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203823.91342: _low_level_execute_command(): starting 13271 1727203823.91348: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203823.5958335-13861-261775827693854/AnsiballZ_command.py && sleep 0' 13271 1727203823.92250: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203823.92269: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203823.92272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203823.92291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203823.92481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203823.92485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203823.92487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203823.92566: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203825.47370: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed 
false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/<<< 13271 1727203825.47390: stdout chunk (state=3): >>>show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in 
dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:50:24.089580", "end": "2024-09-24 14:50:25.471636", "delta": "0:00:01.382056", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13271 1727203825.49382: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203825.49386: stdout chunk (state=3): >>><<< 13271 1727203825.49389: stderr chunk (state=3): >>><<< 13271 1727203825.49392: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 705 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:50:24.089580", "end": "2024-09-24 14:50:25.471636", "delta": "0:00:01.382056", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
13271 1727203825.49400: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203823.5958335-13861-261775827693854/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203825.49403: _low_level_execute_command(): starting 13271 1727203825.49405: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203823.5958335-13861-261775827693854/ > /dev/null 2>&1 && sleep 0' 13271 1727203825.49870: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203825.49887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203825.49899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203825.49941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203825.49966: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203825.50036: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203825.52129: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203825.52154: stderr chunk (state=3): >>><<< 13271 1727203825.52157: stdout chunk (state=3): >>><<< 13271 1727203825.52185: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203825.52189: handler run complete 13271 1727203825.52222: Evaluated conditional (False): False 13271 1727203825.52230: attempt loop complete, returning result 13271 1727203825.52233: _execute() done 13271 1727203825.52235: dumping result to json 13271 1727203825.52247: done dumping result, returning 13271 1727203825.52250: done running TaskExecutor() for managed-node1/TASK: Create test interfaces [028d2410-947f-2a40-12ba-000000000012] 13271 1727203825.52253: sending task result for task 028d2410-947f-2a40-12ba-000000000012 ok: [managed-node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.382056", "end": "2024-09-24 14:50:25.471636", "rc": 0, "start": "2024-09-24 14:50:24.089580" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 705 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 705 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 13271 1727203825.52442: no more pending results, returning what we have 13271 1727203825.52446: results queue empty 13271 1727203825.52446: checking for any_errors_fatal 13271 1727203825.52454: done checking for any_errors_fatal 13271 1727203825.52454: checking for max_fail_percentage 13271 1727203825.52456: done checking for max_fail_percentage 13271 1727203825.52457: checking to see if all hosts have failed and 
the running result is not ok 13271 1727203825.52458: done checking to see if all hosts have failed 13271 1727203825.52459: getting the remaining hosts for this loop 13271 1727203825.52462: done getting the remaining hosts for this loop 13271 1727203825.52465: getting the next task for host managed-node1 13271 1727203825.52472: done getting next task for host managed-node1 13271 1727203825.52478: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13271 1727203825.52481: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203825.52484: getting variables 13271 1727203825.52485: in VariableManager get_vars() 13271 1727203825.52521: Calling all_inventory to load vars for managed-node1 13271 1727203825.52524: Calling groups_inventory to load vars for managed-node1 13271 1727203825.52526: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203825.52536: Calling all_plugins_play to load vars for managed-node1 13271 1727203825.52539: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203825.52541: Calling groups_plugins_play to load vars for managed-node1 13271 1727203825.52688: done sending task result for task 028d2410-947f-2a40-12ba-000000000012 13271 1727203825.52692: WORKER PROCESS EXITING 13271 1727203825.52702: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203825.52850: done with get_vars() 13271 1727203825.52857: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:50:25 -0400 (0:00:01.996) 0:00:09.172 ***** 13271 1727203825.52928: entering _queue_task() for managed-node1/include_tasks 13271 1727203825.53123: worker is 1 (out of 1 available) 13271 1727203825.53136: exiting _queue_task() for managed-node1/include_tasks 13271 1727203825.53148: done queuing things up, now waiting for results queue to drain 13271 1727203825.53150: waiting for pending results... 
13271 1727203825.53301: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 13271 1727203825.53373: in run() - task 028d2410-947f-2a40-12ba-000000000016 13271 1727203825.53388: variable 'ansible_search_path' from source: unknown 13271 1727203825.53391: variable 'ansible_search_path' from source: unknown 13271 1727203825.53417: calling self._execute() 13271 1727203825.53479: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203825.53483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203825.53500: variable 'omit' from source: magic vars 13271 1727203825.53819: variable 'ansible_distribution_major_version' from source: facts 13271 1727203825.53822: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203825.53826: _execute() done 13271 1727203825.53828: dumping result to json 13271 1727203825.53831: done dumping result, returning 13271 1727203825.53833: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-2a40-12ba-000000000016] 13271 1727203825.53835: sending task result for task 028d2410-947f-2a40-12ba-000000000016 13271 1727203825.54051: done sending task result for task 028d2410-947f-2a40-12ba-000000000016 13271 1727203825.54053: WORKER PROCESS EXITING 13271 1727203825.54218: no more pending results, returning what we have 13271 1727203825.54221: in VariableManager get_vars() 13271 1727203825.54256: Calling all_inventory to load vars for managed-node1 13271 1727203825.54258: Calling groups_inventory to load vars for managed-node1 13271 1727203825.54260: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203825.54268: Calling all_plugins_play to load vars for managed-node1 13271 1727203825.54270: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203825.54273: Calling groups_plugins_play to load vars for managed-node1 13271 
1727203825.54558: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203825.54756: done with get_vars() 13271 1727203825.54768: variable 'ansible_search_path' from source: unknown 13271 1727203825.54769: variable 'ansible_search_path' from source: unknown 13271 1727203825.54810: we have included files to process 13271 1727203825.54811: generating all_blocks data 13271 1727203825.54813: done generating all_blocks data 13271 1727203825.54813: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13271 1727203825.54814: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13271 1727203825.54817: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13271 1727203825.55043: done processing included file 13271 1727203825.55044: iterating over new_blocks loaded from include file 13271 1727203825.55046: in VariableManager get_vars() 13271 1727203825.55067: done with get_vars() 13271 1727203825.55069: filtering new block on tags 13271 1727203825.55087: done filtering new block on tags 13271 1727203825.55089: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 13271 1727203825.55094: extending task lists for all hosts with included blocks 13271 1727203825.55193: done extending task lists 13271 1727203825.55195: done processing included files 13271 1727203825.55195: results queue empty 13271 1727203825.55196: checking for any_errors_fatal 13271 1727203825.55202: done checking for any_errors_fatal 13271 1727203825.55203: checking for max_fail_percentage 13271 1727203825.55204: done checking for 
max_fail_percentage 13271 1727203825.55205: checking to see if all hosts have failed and the running result is not ok 13271 1727203825.55206: done checking to see if all hosts have failed 13271 1727203825.55207: getting the remaining hosts for this loop 13271 1727203825.55208: done getting the remaining hosts for this loop 13271 1727203825.55210: getting the next task for host managed-node1 13271 1727203825.55214: done getting next task for host managed-node1 13271 1727203825.55216: ^ task is: TASK: Get stat for interface {{ interface }} 13271 1727203825.55218: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203825.55220: getting variables 13271 1727203825.55221: in VariableManager get_vars() 13271 1727203825.55233: Calling all_inventory to load vars for managed-node1 13271 1727203825.55235: Calling groups_inventory to load vars for managed-node1 13271 1727203825.55237: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203825.55242: Calling all_plugins_play to load vars for managed-node1 13271 1727203825.55244: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203825.55247: Calling groups_plugins_play to load vars for managed-node1 13271 1727203825.55417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203825.55613: done with get_vars() 13271 1727203825.55622: done getting variables 13271 1727203825.55781: variable 'interface' from source: task vars 13271 1727203825.55785: variable 'dhcp_interface1' from source: play vars 13271 1727203825.55845: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:50:25 -0400 (0:00:00.029) 0:00:09.202 ***** 13271 1727203825.55891: entering _queue_task() for managed-node1/stat 13271 1727203825.56148: worker is 1 (out of 1 available) 13271 1727203825.56164: exiting _queue_task() for managed-node1/stat 13271 1727203825.56380: done queuing things up, now waiting for results queue to drain 13271 1727203825.56383: waiting for pending results... 
13271 1727203825.56428: running TaskExecutor() for managed-node1/TASK: Get stat for interface test1 13271 1727203825.56552: in run() - task 028d2410-947f-2a40-12ba-000000000152 13271 1727203825.56572: variable 'ansible_search_path' from source: unknown 13271 1727203825.56605: variable 'ansible_search_path' from source: unknown 13271 1727203825.56623: calling self._execute() 13271 1727203825.56703: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203825.56718: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203825.56780: variable 'omit' from source: magic vars 13271 1727203825.57070: variable 'ansible_distribution_major_version' from source: facts 13271 1727203825.57087: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203825.57097: variable 'omit' from source: magic vars 13271 1727203825.57156: variable 'omit' from source: magic vars 13271 1727203825.57262: variable 'interface' from source: task vars 13271 1727203825.57272: variable 'dhcp_interface1' from source: play vars 13271 1727203825.57337: variable 'dhcp_interface1' from source: play vars 13271 1727203825.57365: variable 'omit' from source: magic vars 13271 1727203825.57409: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203825.57447: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203825.57483: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203825.57506: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203825.57522: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203825.57555: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 13271 1727203825.57568: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203825.57578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203825.57678: Set connection var ansible_connection to ssh 13271 1727203825.57780: Set connection var ansible_shell_type to sh 13271 1727203825.57783: Set connection var ansible_timeout to 10 13271 1727203825.57786: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203825.57788: Set connection var ansible_pipelining to False 13271 1727203825.57790: Set connection var ansible_shell_executable to /bin/sh 13271 1727203825.57792: variable 'ansible_shell_executable' from source: unknown 13271 1727203825.57794: variable 'ansible_connection' from source: unknown 13271 1727203825.57797: variable 'ansible_module_compression' from source: unknown 13271 1727203825.57799: variable 'ansible_shell_type' from source: unknown 13271 1727203825.57801: variable 'ansible_shell_executable' from source: unknown 13271 1727203825.57803: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203825.57805: variable 'ansible_pipelining' from source: unknown 13271 1727203825.57807: variable 'ansible_timeout' from source: unknown 13271 1727203825.57810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203825.58032: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13271 1727203825.58036: variable 'omit' from source: magic vars 13271 1727203825.58038: starting attempt loop 13271 1727203825.58040: running the handler 13271 1727203825.58042: _low_level_execute_command(): starting 13271 1727203825.58043: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 
1727203825.58848: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203825.58870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203825.58919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203825.58938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203825.59025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203825.59041: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203825.59063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203825.59088: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203825.59208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203825.61000: stdout chunk (state=3): >>>/root <<< 13271 1727203825.61154: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203825.61161: stdout chunk (state=3): >>><<< 13271 1727203825.61164: stderr chunk (state=3): >>><<< 13271 1727203825.61283: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203825.61287: _low_level_execute_command(): starting 13271 1727203825.61290: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203825.6119313-13952-221949915279787 `" && echo ansible-tmp-1727203825.6119313-13952-221949915279787="` echo /root/.ansible/tmp/ansible-tmp-1727203825.6119313-13952-221949915279787 `" ) && sleep 0' 13271 1727203825.61877: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203825.61892: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203825.61908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203825.62042: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203825.62066: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203825.62182: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203825.64510: stdout chunk (state=3): >>>ansible-tmp-1727203825.6119313-13952-221949915279787=/root/.ansible/tmp/ansible-tmp-1727203825.6119313-13952-221949915279787 <<< 13271 1727203825.64514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203825.64517: stdout chunk (state=3): >>><<< 13271 1727203825.64519: stderr chunk (state=3): >>><<< 13271 1727203825.64522: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203825.6119313-13952-221949915279787=/root/.ansible/tmp/ansible-tmp-1727203825.6119313-13952-221949915279787 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203825.64571: variable 'ansible_module_compression' from source: unknown 13271 1727203825.64647: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13271 1727203825.64695: variable 'ansible_facts' from source: unknown 13271 1727203825.64835: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203825.6119313-13952-221949915279787/AnsiballZ_stat.py 13271 1727203825.64958: Sending initial data 13271 1727203825.64988: Sent initial data (153 bytes) 13271 1727203825.65700: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203825.65781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203825.65794: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203825.65830: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203825.65850: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203825.65897: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203825.65969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203825.67715: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203825.67806: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203825.67909: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpktn49yk5 /root/.ansible/tmp/ansible-tmp-1727203825.6119313-13952-221949915279787/AnsiballZ_stat.py <<< 13271 1727203825.67919: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203825.6119313-13952-221949915279787/AnsiballZ_stat.py" <<< 13271 1727203825.67994: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpktn49yk5" to remote "/root/.ansible/tmp/ansible-tmp-1727203825.6119313-13952-221949915279787/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203825.6119313-13952-221949915279787/AnsiballZ_stat.py" <<< 13271 1727203825.68909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203825.68920: stderr chunk (state=3): >>><<< 13271 1727203825.68928: stdout chunk (state=3): >>><<< 13271 1727203825.68991: done transferring module to remote 13271 1727203825.69050: _low_level_execute_command(): starting 13271 1727203825.69055: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203825.6119313-13952-221949915279787/ /root/.ansible/tmp/ansible-tmp-1727203825.6119313-13952-221949915279787/AnsiballZ_stat.py && sleep 0' 13271 1727203825.69666: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203825.69681: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203825.69712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203825.69816: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203825.69838: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203825.69856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203825.69965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203825.71974: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203825.71997: stdout chunk (state=3): >>><<< 13271 1727203825.72015: stderr chunk (state=3): >>><<< 13271 1727203825.72039: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203825.72046: _low_level_execute_command(): starting 13271 1727203825.72124: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203825.6119313-13952-221949915279787/AnsiballZ_stat.py && sleep 0' 13271 1727203825.72665: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203825.72689: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203825.72786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203825.72810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203825.72926: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203825.89474: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26491, "dev": 23, "nlink": 1, "atime": 1727203824.0973575, "mtime": 1727203824.0973575, "ctime": 1727203824.0973575, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13271 1727203825.91182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203825.91186: stdout chunk (state=3): >>><<< 13271 1727203825.91189: stderr chunk (state=3): >>><<< 13271 1727203825.91191: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26491, "dev": 23, "nlink": 1, "atime": 1727203824.0973575, "mtime": 1727203824.0973575, "ctime": 1727203824.0973575, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 13271 1727203825.91193: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203825.6119313-13952-221949915279787/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203825.91196: _low_level_execute_command(): starting 13271 1727203825.91198: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203825.6119313-13952-221949915279787/ > /dev/null 2>&1 && sleep 0' 13271 1727203825.91778: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203825.91786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203825.91798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203825.91812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203825.91825: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 
13271 1727203825.91839: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203825.91850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203825.91875: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 1727203825.91882: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 13271 1727203825.91955: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203825.91984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203825.91987: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203825.92019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203825.92100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203825.94099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203825.94135: stderr chunk (state=3): >>><<< 13271 1727203825.94138: stdout chunk (state=3): >>><<< 13271 1727203825.94164: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203825.94168: handler run complete 13271 1727203825.94215: attempt loop complete, returning result 13271 1727203825.94218: _execute() done 13271 1727203825.94220: dumping result to json 13271 1727203825.94254: done dumping result, returning 13271 1727203825.94257: done running TaskExecutor() for managed-node1/TASK: Get stat for interface test1 [028d2410-947f-2a40-12ba-000000000152] 13271 1727203825.94261: sending task result for task 028d2410-947f-2a40-12ba-000000000152 13271 1727203825.94354: done sending task result for task 028d2410-947f-2a40-12ba-000000000152 13271 1727203825.94356: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727203824.0973575, "block_size": 4096, "blocks": 0, "ctime": 1727203824.0973575, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26491, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1727203824.0973575, "nlink": 1, "path": 
"/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13271 1727203825.94481: no more pending results, returning what we have 13271 1727203825.94484: results queue empty 13271 1727203825.94485: checking for any_errors_fatal 13271 1727203825.94486: done checking for any_errors_fatal 13271 1727203825.94487: checking for max_fail_percentage 13271 1727203825.94488: done checking for max_fail_percentage 13271 1727203825.94489: checking to see if all hosts have failed and the running result is not ok 13271 1727203825.94490: done checking to see if all hosts have failed 13271 1727203825.94491: getting the remaining hosts for this loop 13271 1727203825.94492: done getting the remaining hosts for this loop 13271 1727203825.94495: getting the next task for host managed-node1 13271 1727203825.94502: done getting next task for host managed-node1 13271 1727203825.94504: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13271 1727203825.94507: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203825.94511: getting variables 13271 1727203825.94512: in VariableManager get_vars() 13271 1727203825.94545: Calling all_inventory to load vars for managed-node1 13271 1727203825.94548: Calling groups_inventory to load vars for managed-node1 13271 1727203825.94550: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203825.94561: Calling all_plugins_play to load vars for managed-node1 13271 1727203825.94564: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203825.94566: Calling groups_plugins_play to load vars for managed-node1 13271 1727203825.94743: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203825.94943: done with get_vars() 13271 1727203825.94954: done getting variables 13271 1727203825.95049: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 13271 1727203825.95163: variable 'interface' from source: task vars 13271 1727203825.95168: variable 'dhcp_interface1' from source: play vars 13271 1727203825.95214: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:50:25 -0400 (0:00:00.393) 0:00:09.595 ***** 13271 1727203825.95249: entering _queue_task() for managed-node1/assert 13271 1727203825.95254: Creating lock for assert 13271 1727203825.95505: worker is 1 (out of 1 available) 13271 1727203825.95518: exiting _queue_task() for managed-node1/assert 13271 1727203825.95530: done queuing things up, now waiting for results queue to drain 13271 
1727203825.95532: waiting for pending results... 13271 1727203825.95892: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test1' 13271 1727203825.95897: in run() - task 028d2410-947f-2a40-12ba-000000000017 13271 1727203825.95914: variable 'ansible_search_path' from source: unknown 13271 1727203825.95920: variable 'ansible_search_path' from source: unknown 13271 1727203825.95956: calling self._execute() 13271 1727203825.96040: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203825.96051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203825.96068: variable 'omit' from source: magic vars 13271 1727203825.96494: variable 'ansible_distribution_major_version' from source: facts 13271 1727203825.96541: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203825.96546: variable 'omit' from source: magic vars 13271 1727203825.96562: variable 'omit' from source: magic vars 13271 1727203825.96627: variable 'interface' from source: task vars 13271 1727203825.96631: variable 'dhcp_interface1' from source: play vars 13271 1727203825.96683: variable 'dhcp_interface1' from source: play vars 13271 1727203825.96697: variable 'omit' from source: magic vars 13271 1727203825.96728: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203825.96763: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203825.96773: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203825.96788: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203825.96798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203825.96821: 
variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203825.96824: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203825.96826: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203825.96895: Set connection var ansible_connection to ssh 13271 1727203825.96902: Set connection var ansible_shell_type to sh 13271 1727203825.96909: Set connection var ansible_timeout to 10 13271 1727203825.96914: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203825.96919: Set connection var ansible_pipelining to False 13271 1727203825.96924: Set connection var ansible_shell_executable to /bin/sh 13271 1727203825.96941: variable 'ansible_shell_executable' from source: unknown 13271 1727203825.96944: variable 'ansible_connection' from source: unknown 13271 1727203825.96946: variable 'ansible_module_compression' from source: unknown 13271 1727203825.96949: variable 'ansible_shell_type' from source: unknown 13271 1727203825.96951: variable 'ansible_shell_executable' from source: unknown 13271 1727203825.96953: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203825.96956: variable 'ansible_pipelining' from source: unknown 13271 1727203825.96961: variable 'ansible_timeout' from source: unknown 13271 1727203825.96964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203825.97062: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203825.97069: variable 'omit' from source: magic vars 13271 1727203825.97074: starting attempt loop 13271 1727203825.97078: running the handler 13271 1727203825.97165: variable 'interface_stat' from source: set_fact 13271 
1727203825.97178: Evaluated conditional (interface_stat.stat.exists): True 13271 1727203825.97184: handler run complete 13271 1727203825.97197: attempt loop complete, returning result 13271 1727203825.97201: _execute() done 13271 1727203825.97204: dumping result to json 13271 1727203825.97206: done dumping result, returning 13271 1727203825.97208: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test1' [028d2410-947f-2a40-12ba-000000000017] 13271 1727203825.97216: sending task result for task 028d2410-947f-2a40-12ba-000000000017 13271 1727203825.97291: done sending task result for task 028d2410-947f-2a40-12ba-000000000017 13271 1727203825.97293: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 13271 1727203825.97362: no more pending results, returning what we have 13271 1727203825.97365: results queue empty 13271 1727203825.97366: checking for any_errors_fatal 13271 1727203825.97373: done checking for any_errors_fatal 13271 1727203825.97374: checking for max_fail_percentage 13271 1727203825.97377: done checking for max_fail_percentage 13271 1727203825.97378: checking to see if all hosts have failed and the running result is not ok 13271 1727203825.97379: done checking to see if all hosts have failed 13271 1727203825.97380: getting the remaining hosts for this loop 13271 1727203825.97381: done getting the remaining hosts for this loop 13271 1727203825.97384: getting the next task for host managed-node1 13271 1727203825.97391: done getting next task for host managed-node1 13271 1727203825.97393: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13271 1727203825.97396: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203825.97399: getting variables 13271 1727203825.97400: in VariableManager get_vars() 13271 1727203825.97432: Calling all_inventory to load vars for managed-node1 13271 1727203825.97435: Calling groups_inventory to load vars for managed-node1 13271 1727203825.97437: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203825.97446: Calling all_plugins_play to load vars for managed-node1 13271 1727203825.97448: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203825.97451: Calling groups_plugins_play to load vars for managed-node1 13271 1727203825.97595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203825.97709: done with get_vars() 13271 1727203825.97716: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:50:25 -0400 (0:00:00.025) 0:00:09.620 ***** 13271 1727203825.97786: entering _queue_task() for managed-node1/include_tasks 13271 1727203825.98027: worker is 1 (out of 1 available) 13271 1727203825.98040: exiting _queue_task() for managed-node1/include_tasks 13271 1727203825.98053: done queuing things up, now waiting for results queue to drain 13271 1727203825.98055: waiting for pending results... 
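The entries above trace the assert task at assert_device_present.yml:5 (conditional `interface_stat.stat.exists` evaluated True) and the queuing of the include at assert_device_present.yml:3. A task file consistent with those paths and task names would look roughly like the following; this is a reconstruction inferred from the log, not the actual file contents:

```yaml
# Hypothetical sketch of assert_device_present.yml, reconstructed from the
# task names and paths in this log; the real file may differ in detail.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: get_interface_stat.yml

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists
```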
13271 1727203825.98421: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 13271 1727203825.98694: in run() - task 028d2410-947f-2a40-12ba-00000000001b 13271 1727203825.98698: variable 'ansible_search_path' from source: unknown 13271 1727203825.98700: variable 'ansible_search_path' from source: unknown 13271 1727203825.98702: calling self._execute() 13271 1727203825.98823: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203825.98836: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203825.98851: variable 'omit' from source: magic vars 13271 1727203825.99236: variable 'ansible_distribution_major_version' from source: facts 13271 1727203825.99251: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203825.99264: _execute() done 13271 1727203825.99273: dumping result to json 13271 1727203825.99284: done dumping result, returning 13271 1727203825.99294: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-2a40-12ba-00000000001b] 13271 1727203825.99303: sending task result for task 028d2410-947f-2a40-12ba-00000000001b 13271 1727203825.99510: no more pending results, returning what we have 13271 1727203825.99515: in VariableManager get_vars() 13271 1727203825.99556: Calling all_inventory to load vars for managed-node1 13271 1727203825.99559: Calling groups_inventory to load vars for managed-node1 13271 1727203825.99564: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203825.99574: Calling all_plugins_play to load vars for managed-node1 13271 1727203825.99590: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203825.99609: done sending task result for task 028d2410-947f-2a40-12ba-00000000001b 13271 1727203825.99613: WORKER PROCESS EXITING 13271 1727203825.99617: Calling groups_plugins_play to load vars for managed-node1 13271 
1727203825.99766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203826.00051: done with get_vars() 13271 1727203826.00056: variable 'ansible_search_path' from source: unknown 13271 1727203826.00057: variable 'ansible_search_path' from source: unknown 13271 1727203826.00082: we have included files to process 13271 1727203826.00083: generating all_blocks data 13271 1727203826.00084: done generating all_blocks data 13271 1727203826.00086: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13271 1727203826.00086: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13271 1727203826.00088: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13271 1727203826.00202: done processing included file 13271 1727203826.00204: iterating over new_blocks loaded from include file 13271 1727203826.00205: in VariableManager get_vars() 13271 1727203826.00217: done with get_vars() 13271 1727203826.00218: filtering new block on tags 13271 1727203826.00227: done filtering new block on tags 13271 1727203826.00229: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 13271 1727203826.00232: extending task lists for all hosts with included blocks 13271 1727203826.00293: done extending task lists 13271 1727203826.00294: done processing included files 13271 1727203826.00295: results queue empty 13271 1727203826.00295: checking for any_errors_fatal 13271 1727203826.00297: done checking for any_errors_fatal 13271 1727203826.00298: checking for max_fail_percentage 13271 1727203826.00298: done checking for 
max_fail_percentage 13271 1727203826.00299: checking to see if all hosts have failed and the running result is not ok 13271 1727203826.00299: done checking to see if all hosts have failed 13271 1727203826.00300: getting the remaining hosts for this loop 13271 1727203826.00300: done getting the remaining hosts for this loop 13271 1727203826.00302: getting the next task for host managed-node1 13271 1727203826.00305: done getting next task for host managed-node1 13271 1727203826.00306: ^ task is: TASK: Get stat for interface {{ interface }} 13271 1727203826.00308: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203826.00309: getting variables 13271 1727203826.00310: in VariableManager get_vars() 13271 1727203826.00318: Calling all_inventory to load vars for managed-node1 13271 1727203826.00319: Calling groups_inventory to load vars for managed-node1 13271 1727203826.00321: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203826.00326: Calling all_plugins_play to load vars for managed-node1 13271 1727203826.00327: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203826.00329: Calling groups_plugins_play to load vars for managed-node1 13271 1727203826.00409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203826.00517: done with get_vars() 13271 1727203826.00522: done getting variables 13271 1727203826.00625: variable 'interface' from source: task vars 13271 1727203826.00628: variable 'dhcp_interface2' from source: play vars 13271 1727203826.00666: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:50:26 -0400 (0:00:00.029) 0:00:09.650 ***** 13271 1727203826.00691: entering _queue_task() for managed-node1/stat 13271 1727203826.00882: worker is 1 (out of 1 available) 13271 1727203826.00894: exiting _queue_task() for managed-node1/stat 13271 1727203826.00904: done queuing things up, now waiting for results queue to drain 13271 1727203826.00906: waiting for pending results... 
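The stat task queued here (get_interface_stat.yml:3, "Get stat for interface test2") matches the earlier module invocation for test1, whose arguments are visible in the result JSON (`get_attributes: false`, `get_checksum: false`, `get_mime: false`, `path: /sys/class/net/test1`). A task file consistent with that invocation, and with the `interface_stat` fact the assert later reads, would look roughly like this; again a reconstruction from the log, not the actual file:

```yaml
# Hypothetical sketch of get_interface_stat.yml, inferred from the stat
# module_args in this log; the real file may differ in detail.
- name: Get stat for interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat
```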
13271 1727203826.01294: running TaskExecutor() for managed-node1/TASK: Get stat for interface test2 13271 1727203826.01300: in run() - task 028d2410-947f-2a40-12ba-00000000016a 13271 1727203826.01302: variable 'ansible_search_path' from source: unknown 13271 1727203826.01305: variable 'ansible_search_path' from source: unknown 13271 1727203826.01307: calling self._execute() 13271 1727203826.01341: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203826.01351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203826.01367: variable 'omit' from source: magic vars 13271 1727203826.01694: variable 'ansible_distribution_major_version' from source: facts 13271 1727203826.01707: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203826.01716: variable 'omit' from source: magic vars 13271 1727203826.01770: variable 'omit' from source: magic vars 13271 1727203826.01865: variable 'interface' from source: task vars 13271 1727203826.01874: variable 'dhcp_interface2' from source: play vars 13271 1727203826.01931: variable 'dhcp_interface2' from source: play vars 13271 1727203826.01954: variable 'omit' from source: magic vars 13271 1727203826.02001: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203826.02039: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203826.02066: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203826.02089: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203826.02103: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203826.02135: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 13271 1727203826.02143: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203826.02150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203826.02246: Set connection var ansible_connection to ssh 13271 1727203826.02259: Set connection var ansible_shell_type to sh 13271 1727203826.02274: Set connection var ansible_timeout to 10 13271 1727203826.02285: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203826.02293: Set connection var ansible_pipelining to False 13271 1727203826.02301: Set connection var ansible_shell_executable to /bin/sh 13271 1727203826.02394: variable 'ansible_shell_executable' from source: unknown 13271 1727203826.02449: variable 'ansible_connection' from source: unknown 13271 1727203826.02456: variable 'ansible_module_compression' from source: unknown 13271 1727203826.02462: variable 'ansible_shell_type' from source: unknown 13271 1727203826.02469: variable 'ansible_shell_executable' from source: unknown 13271 1727203826.02478: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203826.02496: variable 'ansible_pipelining' from source: unknown 13271 1727203826.02503: variable 'ansible_timeout' from source: unknown 13271 1727203826.02511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203826.02738: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13271 1727203826.02753: variable 'omit' from source: magic vars 13271 1727203826.02763: starting attempt loop 13271 1727203826.02770: running the handler 13271 1727203826.02824: _low_level_execute_command(): starting 13271 1727203826.02827: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 
1727203826.04301: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203826.04361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203826.04365: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203826.04393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203826.04686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203826.06456: stdout chunk (state=3): >>>/root <<< 13271 1727203826.06599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203826.06623: stderr chunk (state=3): >>><<< 13271 1727203826.06647: stdout chunk (state=3): >>><<< 13271 1727203826.06686: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203826.06708: _low_level_execute_command(): starting 13271 1727203826.06721: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203826.066937-13974-89300233262682 `" && echo ansible-tmp-1727203826.066937-13974-89300233262682="` echo /root/.ansible/tmp/ansible-tmp-1727203826.066937-13974-89300233262682 `" ) && sleep 0' 13271 1727203826.07442: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203826.07496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 13271 1727203826.07541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203826.07621: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203826.07654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203826.07796: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203826.09899: stdout chunk (state=3): >>>ansible-tmp-1727203826.066937-13974-89300233262682=/root/.ansible/tmp/ansible-tmp-1727203826.066937-13974-89300233262682 <<< 13271 1727203826.10002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203826.10038: stderr chunk (state=3): >>><<< 13271 1727203826.10041: stdout chunk (state=3): >>><<< 13271 1727203826.10056: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203826.066937-13974-89300233262682=/root/.ansible/tmp/ansible-tmp-1727203826.066937-13974-89300233262682 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203826.10135: variable 'ansible_module_compression' from source: unknown 13271 1727203826.10381: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13271 1727203826.10384: variable 'ansible_facts' from source: unknown 13271 1727203826.10390: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203826.066937-13974-89300233262682/AnsiballZ_stat.py 13271 1727203826.10520: Sending initial data 13271 1727203826.10523: Sent initial data (151 bytes) 13271 1727203826.11083: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203826.11086: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203826.11099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203826.11113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203826.11125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203826.11132: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203826.11142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203826.11166: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass <<< 13271 1727203826.11194: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203826.11272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203826.11293: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203826.11392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203826.13140: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203826.13231: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203826.13304: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpk75lr159 /root/.ansible/tmp/ansible-tmp-1727203826.066937-13974-89300233262682/AnsiballZ_stat.py <<< 13271 1727203826.13340: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203826.066937-13974-89300233262682/AnsiballZ_stat.py" <<< 13271 1727203826.13407: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpk75lr159" to remote "/root/.ansible/tmp/ansible-tmp-1727203826.066937-13974-89300233262682/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203826.066937-13974-89300233262682/AnsiballZ_stat.py" <<< 13271 1727203826.14333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203826.14340: stderr chunk (state=3): >>><<< 13271 1727203826.14343: stdout chunk (state=3): >>><<< 13271 1727203826.14345: done transferring module to remote 13271 1727203826.14373: _low_level_execute_command(): starting 13271 1727203826.14378: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203826.066937-13974-89300233262682/ /root/.ansible/tmp/ansible-tmp-1727203826.066937-13974-89300233262682/AnsiballZ_stat.py && sleep 0' 13271 1727203826.14803: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203826.14807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203826.14810: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203826.14812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203826.14862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203826.14873: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203826.14952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203826.17045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203826.17049: stdout chunk (state=3): >>><<< 13271 1727203826.17051: stderr chunk (state=3): >>><<< 13271 1727203826.17054: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203826.17056: _low_level_execute_command(): starting 13271 1727203826.17058: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203826.066937-13974-89300233262682/AnsiballZ_stat.py && sleep 0' 13271 1727203826.17538: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203826.17541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203826.17543: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 13271 1727203826.17546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203826.17548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203826.17601: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203826.17607: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203826.17696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203826.34077: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26897, "dev": 23, "nlink": 1, "atime": 1727203824.105809, "mtime": 1727203824.105809, "ctime": 1727203824.105809, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13271 1727203826.35734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203826.35738: stdout chunk (state=3): >>><<< 13271 1727203826.35740: stderr chunk (state=3): >>><<< 13271 1727203826.35938: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26897, "dev": 23, "nlink": 1, "atime": 1727203824.105809, "mtime": 1727203824.105809, "ctime": 1727203824.105809, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 13271 1727203826.35943: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203826.066937-13974-89300233262682/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203826.35946: _low_level_execute_command(): starting 13271 1727203826.35948: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203826.066937-13974-89300233262682/ > /dev/null 2>&1 && sleep 0' 13271 1727203826.36372: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203826.36395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203826.36406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203826.36451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203826.36472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203826.36538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203826.38899: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203826.38902: stdout chunk (state=3): >>><<< 13271 1727203826.38904: stderr chunk (state=3): >>><<< 13271 1727203826.38907: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203826.38909: handler run complete 13271 1727203826.38911: attempt loop complete, returning result 13271 1727203826.38913: _execute() done 13271 1727203826.38915: dumping result to json 13271 1727203826.38917: done dumping result, returning 13271 1727203826.38919: done running TaskExecutor() for managed-node1/TASK: Get stat for interface test2 [028d2410-947f-2a40-12ba-00000000016a] 13271 1727203826.38920: sending task result for task 028d2410-947f-2a40-12ba-00000000016a ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727203824.105809, "block_size": 4096, "blocks": 0, "ctime": 1727203824.105809, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26897, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1727203824.105809, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13271 1727203826.39244: no more pending results, returning what we have 13271 1727203826.39248: results queue empty 13271 1727203826.39249: checking for any_errors_fatal 13271 1727203826.39250: done checking for any_errors_fatal 13271 1727203826.39251: checking for max_fail_percentage 13271 1727203826.39253: done checking for max_fail_percentage 13271 1727203826.39254: checking 
to see if all hosts have failed and the running result is not ok 13271 1727203826.39255: done checking to see if all hosts have failed 13271 1727203826.39255: getting the remaining hosts for this loop 13271 1727203826.39257: done getting the remaining hosts for this loop 13271 1727203826.39260: getting the next task for host managed-node1 13271 1727203826.39269: done getting next task for host managed-node1 13271 1727203826.39272: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13271 1727203826.39277: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203826.39282: getting variables 13271 1727203826.39283: in VariableManager get_vars() 13271 1727203826.39322: Calling all_inventory to load vars for managed-node1 13271 1727203826.39662: Calling groups_inventory to load vars for managed-node1 13271 1727203826.39665: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203826.39677: Calling all_plugins_play to load vars for managed-node1 13271 1727203826.39680: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203826.39683: Calling groups_plugins_play to load vars for managed-node1 13271 1727203826.40182: done sending task result for task 028d2410-947f-2a40-12ba-00000000016a 13271 1727203826.40185: WORKER PROCESS EXITING 13271 1727203826.40213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203826.40417: done with get_vars() 13271 1727203826.40428: done getting variables 13271 1727203826.40492: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203826.40624: variable 'interface' from source: task vars 13271 1727203826.40628: variable 'dhcp_interface2' from source: play vars 13271 1727203826.40694: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:50:26 -0400 (0:00:00.400) 0:00:10.050 ***** 13271 1727203826.40725: entering _queue_task() for managed-node1/assert 13271 1727203826.41010: worker is 1 (out of 1 available) 13271 1727203826.41022: exiting _queue_task() for managed-node1/assert 13271 
1727203826.41033: done queuing things up, now waiting for results queue to drain 13271 1727203826.41035: waiting for pending results... 13271 1727203826.41298: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test2' 13271 1727203826.41393: in run() - task 028d2410-947f-2a40-12ba-00000000001c 13271 1727203826.41412: variable 'ansible_search_path' from source: unknown 13271 1727203826.41420: variable 'ansible_search_path' from source: unknown 13271 1727203826.41482: calling self._execute() 13271 1727203826.41545: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203826.41556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203826.41571: variable 'omit' from source: magic vars 13271 1727203826.41935: variable 'ansible_distribution_major_version' from source: facts 13271 1727203826.41939: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203826.41941: variable 'omit' from source: magic vars 13271 1727203826.41990: variable 'omit' from source: magic vars 13271 1727203826.42281: variable 'interface' from source: task vars 13271 1727203826.42284: variable 'dhcp_interface2' from source: play vars 13271 1727203826.42286: variable 'dhcp_interface2' from source: play vars 13271 1727203826.42289: variable 'omit' from source: magic vars 13271 1727203826.42291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203826.42293: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203826.42295: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203826.42297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203826.42309: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203826.42341: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203826.42350: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203826.42356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203826.42459: Set connection var ansible_connection to ssh 13271 1727203826.42475: Set connection var ansible_shell_type to sh 13271 1727203826.42489: Set connection var ansible_timeout to 10 13271 1727203826.42498: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203826.42508: Set connection var ansible_pipelining to False 13271 1727203826.42521: Set connection var ansible_shell_executable to /bin/sh 13271 1727203826.42546: variable 'ansible_shell_executable' from source: unknown 13271 1727203826.42554: variable 'ansible_connection' from source: unknown 13271 1727203826.42560: variable 'ansible_module_compression' from source: unknown 13271 1727203826.42576: variable 'ansible_shell_type' from source: unknown 13271 1727203826.42584: variable 'ansible_shell_executable' from source: unknown 13271 1727203826.42590: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203826.42597: variable 'ansible_pipelining' from source: unknown 13271 1727203826.42603: variable 'ansible_timeout' from source: unknown 13271 1727203826.42609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203826.42757: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203826.42775: variable 'omit' from source: magic vars 13271 1727203826.42787: starting 
attempt loop 13271 1727203826.42793: running the handler 13271 1727203826.42915: variable 'interface_stat' from source: set_fact 13271 1727203826.42952: Evaluated conditional (interface_stat.stat.exists): True 13271 1727203826.42955: handler run complete 13271 1727203826.42969: attempt loop complete, returning result 13271 1727203826.43060: _execute() done 13271 1727203826.43065: dumping result to json 13271 1727203826.43067: done dumping result, returning 13271 1727203826.43069: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test2' [028d2410-947f-2a40-12ba-00000000001c] 13271 1727203826.43071: sending task result for task 028d2410-947f-2a40-12ba-00000000001c 13271 1727203826.43135: done sending task result for task 028d2410-947f-2a40-12ba-00000000001c 13271 1727203826.43138: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 13271 1727203826.43212: no more pending results, returning what we have 13271 1727203826.43216: results queue empty 13271 1727203826.43217: checking for any_errors_fatal 13271 1727203826.43225: done checking for any_errors_fatal 13271 1727203826.43226: checking for max_fail_percentage 13271 1727203826.43228: done checking for max_fail_percentage 13271 1727203826.43229: checking to see if all hosts have failed and the running result is not ok 13271 1727203826.43230: done checking to see if all hosts have failed 13271 1727203826.43231: getting the remaining hosts for this loop 13271 1727203826.43232: done getting the remaining hosts for this loop 13271 1727203826.43235: getting the next task for host managed-node1 13271 1727203826.43243: done getting next task for host managed-node1 13271 1727203826.43246: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 13271 1727203826.43248: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203826.43251: getting variables 13271 1727203826.43253: in VariableManager get_vars() 13271 1727203826.43295: Calling all_inventory to load vars for managed-node1 13271 1727203826.43298: Calling groups_inventory to load vars for managed-node1 13271 1727203826.43300: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203826.43311: Calling all_plugins_play to load vars for managed-node1 13271 1727203826.43314: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203826.43317: Calling groups_plugins_play to load vars for managed-node1 13271 1727203826.43586: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203826.43868: done with get_vars() 13271 1727203826.43881: done getting variables 13271 1727203826.43929: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:28 Tuesday 24 September 2024 14:50:26 -0400 (0:00:00.032) 0:00:10.082 ***** 13271 1727203826.43955: entering _queue_task() for managed-node1/command 13271 1727203826.44191: worker is 1 (out of 1 available) 13271 1727203826.44204: exiting _queue_task() for managed-node1/command 13271 1727203826.44218: done queuing things up, now waiting for results queue to drain 13271 1727203826.44219: waiting for pending results... 
13271 1727203826.44713: running TaskExecutor() for managed-node1/TASK: Backup the /etc/resolv.conf for initscript
13271 1727203826.44816: in run() - task 028d2410-947f-2a40-12ba-00000000001d
13271 1727203826.45081: variable 'ansible_search_path' from source: unknown
13271 1727203826.45087: calling self._execute()
13271 1727203826.45200: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203826.45229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203826.45263: variable 'omit' from source: magic vars
13271 1727203826.45890: variable 'ansible_distribution_major_version' from source: facts
13271 1727203826.45907: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203826.46026: variable 'network_provider' from source: set_fact
13271 1727203826.46079: Evaluated conditional (network_provider == "initscripts"): False
13271 1727203826.46082: when evaluation is False, skipping this task
13271 1727203826.46085: _execute() done
13271 1727203826.46087: dumping result to json
13271 1727203826.46089: done dumping result, returning
13271 1727203826.46092: done running TaskExecutor() for managed-node1/TASK: Backup the /etc/resolv.conf for initscript [028d2410-947f-2a40-12ba-00000000001d]
13271 1727203826.46094: sending task result for task 028d2410-947f-2a40-12ba-00000000001d
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
13271 1727203826.46224: no more pending results, returning what we have
13271 1727203826.46227: results queue empty
13271 1727203826.46228: checking for any_errors_fatal
13271 1727203826.46232: done checking for any_errors_fatal
13271 1727203826.46233: checking for max_fail_percentage
13271 1727203826.46234: done checking for max_fail_percentage
13271 1727203826.46235: checking to see if all hosts have failed and the running result is not ok
13271 1727203826.46236: done checking to see if all hosts have failed
13271 1727203826.46237: getting the remaining hosts for this loop
13271 1727203826.46238: done getting the remaining hosts for this loop
13271 1727203826.46242: getting the next task for host managed-node1
13271 1727203826.46249: done getting next task for host managed-node1
13271 1727203826.46251: ^ task is: TASK: TEST Add Bond with 2 ports
13271 1727203826.46255: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203826.46258: getting variables
13271 1727203826.46263: in VariableManager get_vars()
13271 1727203826.46307: Calling all_inventory to load vars for managed-node1
13271 1727203826.46310: Calling groups_inventory to load vars for managed-node1
13271 1727203826.46313: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203826.46326: Calling all_plugins_play to load vars for managed-node1
13271 1727203826.46330: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203826.46333: Calling groups_plugins_play to load vars for managed-node1
13271 1727203826.46672: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203826.47014: done with get_vars()
13271 1727203826.47024: done getting variables
13271 1727203826.47058: done sending task result for task 028d2410-947f-2a40-12ba-00000000001d
13271 1727203826.47063: WORKER PROCESS EXITING
13271 1727203826.47099: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [TEST Add Bond with 2 ports] **********************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:33
Tuesday 24 September 2024 14:50:26 -0400 (0:00:00.031) 0:00:10.114 *****

13271 1727203826.47131: entering _queue_task() for managed-node1/debug
13271 1727203826.47378: worker is 1 (out of 1 available)
13271 1727203826.47389: exiting _queue_task() for managed-node1/debug
13271 1727203826.47399: done queuing things up, now waiting for results queue to drain
13271 1727203826.47401: waiting for pending results...
13271 1727203826.47656: running TaskExecutor() for managed-node1/TASK: TEST Add Bond with 2 ports
13271 1727203826.47751: in run() - task 028d2410-947f-2a40-12ba-00000000001e
13271 1727203826.47779: variable 'ansible_search_path' from source: unknown
13271 1727203826.47817: calling self._execute()
13271 1727203826.47904: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203826.47916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203826.47929: variable 'omit' from source: magic vars
13271 1727203826.48302: variable 'ansible_distribution_major_version' from source: facts
13271 1727203826.48323: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203826.48333: variable 'omit' from source: magic vars
13271 1727203826.48359: variable 'omit' from source: magic vars
13271 1727203826.48402: variable 'omit' from source: magic vars
13271 1727203826.48448: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13271 1727203826.48493: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13271 1727203826.48518: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13271 1727203826.48544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13271 1727203826.48564: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13271 1727203826.48598: variable 'inventory_hostname' from source: host vars for 'managed-node1'
13271 1727203826.48607: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203826.48613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203826.48711: Set connection var ansible_connection to ssh
13271 1727203826.48723: Set connection var ansible_shell_type to sh
13271 1727203826.48743: Set connection var ansible_timeout to 10
13271 1727203826.48780: Set connection var ansible_module_compression to ZIP_DEFLATED
13271 1727203826.48783: Set connection var ansible_pipelining to False
13271 1727203826.48785: Set connection var ansible_shell_executable to /bin/sh
13271 1727203826.48789: variable 'ansible_shell_executable' from source: unknown
13271 1727203826.48795: variable 'ansible_connection' from source: unknown
13271 1727203826.48801: variable 'ansible_module_compression' from source: unknown
13271 1727203826.48806: variable 'ansible_shell_type' from source: unknown
13271 1727203826.48811: variable 'ansible_shell_executable' from source: unknown
13271 1727203826.49182: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203826.49185: variable 'ansible_pipelining' from source: unknown
13271 1727203826.49189: variable 'ansible_timeout' from source: unknown
13271 1727203826.49191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203826.49195: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
13271 1727203826.49198: variable 'omit' from source: magic vars
13271 1727203826.49201: starting attempt loop
13271 1727203826.49203: running the handler
13271 1727203826.49312: handler run complete
13271 1727203826.49365: attempt loop complete, returning result
13271 1727203826.49374: _execute() done
13271 1727203826.49384: dumping result to json
13271 1727203826.49398: done dumping result, returning
13271 1727203826.49409: done running TaskExecutor() for managed-node1/TASK: TEST Add Bond with 2 ports [028d2410-947f-2a40-12ba-00000000001e]
13271 1727203826.49418: sending task result for task 028d2410-947f-2a40-12ba-00000000001e
ok: [managed-node1] => {}

MSG:

##################################################
13271 1727203826.49569: no more pending results, returning what we have
13271 1727203826.49573: results queue empty
13271 1727203826.49573: checking for any_errors_fatal
13271 1727203826.49580: done checking for any_errors_fatal
13271 1727203826.49581: checking for max_fail_percentage
13271 1727203826.49582: done checking for max_fail_percentage
13271 1727203826.49584: checking to see if all hosts have failed and the running result is not ok
13271 1727203826.49585: done checking to see if all hosts have failed
13271 1727203826.49585: getting the remaining hosts for this loop
13271 1727203826.49587: done getting the remaining hosts for this loop
13271 1727203826.49590: getting the next task for host managed-node1
13271 1727203826.49599: done getting next task for host managed-node1
13271 1727203826.49605: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
13271 1727203826.49608: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203826.49625: getting variables
13271 1727203826.49628: in VariableManager get_vars()
13271 1727203826.49670: Calling all_inventory to load vars for managed-node1
13271 1727203826.49673: Calling groups_inventory to load vars for managed-node1
13271 1727203826.49780: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203826.49792: Calling all_plugins_play to load vars for managed-node1
13271 1727203826.49795: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203826.49799: Calling groups_plugins_play to load vars for managed-node1
13271 1727203826.50140: done sending task result for task 028d2410-947f-2a40-12ba-00000000001e
13271 1727203826.50143: WORKER PROCESS EXITING
13271 1727203826.50151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203826.50382: done with get_vars()
13271 1727203826.50393: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Tuesday 24 September 2024 14:50:26 -0400 (0:00:00.033) 0:00:10.148 *****

13271 1727203826.50486: entering _queue_task() for managed-node1/include_tasks
13271 1727203826.50711: worker is 1 (out of 1 available)
13271 1727203826.50723: exiting _queue_task() for managed-node1/include_tasks
13271 1727203826.50734: done queuing things up, now waiting for results queue to drain
13271 1727203826.50736: waiting for pending results...
13271 1727203826.50985: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
13271 1727203826.51120: in run() - task 028d2410-947f-2a40-12ba-000000000026
13271 1727203826.51140: variable 'ansible_search_path' from source: unknown
13271 1727203826.51148: variable 'ansible_search_path' from source: unknown
13271 1727203826.51192: calling self._execute()
13271 1727203826.51277: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203826.51289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203826.51301: variable 'omit' from source: magic vars
13271 1727203826.51654: variable 'ansible_distribution_major_version' from source: facts
13271 1727203826.51673: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203826.51686: _execute() done
13271 1727203826.51694: dumping result to json
13271 1727203826.51701: done dumping result, returning
13271 1727203826.51712: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-2a40-12ba-000000000026]
13271 1727203826.51720: sending task result for task 028d2410-947f-2a40-12ba-000000000026
13271 1727203826.51843: no more pending results, returning what we have
13271 1727203826.51848: in VariableManager get_vars()
13271 1727203826.51898: Calling all_inventory to load vars for managed-node1
13271 1727203826.51900: Calling groups_inventory to load vars for managed-node1
13271 1727203826.51903: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203826.51914: Calling all_plugins_play to load vars for managed-node1
13271 1727203826.51918: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203826.51921: Calling groups_plugins_play to load vars for managed-node1
13271 1727203826.52213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203826.52507: done with get_vars()
13271 1727203826.52514: variable 'ansible_search_path' from source: unknown
13271 1727203826.52515: variable 'ansible_search_path' from source: unknown
13271 1727203826.52528: done sending task result for task 028d2410-947f-2a40-12ba-000000000026
13271 1727203826.52531: WORKER PROCESS EXITING
13271 1727203826.52560: we have included files to process
13271 1727203826.52564: generating all_blocks data
13271 1727203826.52566: done generating all_blocks data
13271 1727203826.52571: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
13271 1727203826.52572: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
13271 1727203826.52574: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
13271 1727203826.53258: done processing included file
13271 1727203826.53263: iterating over new_blocks loaded from include file
13271 1727203826.53264: in VariableManager get_vars()
13271 1727203826.53290: done with get_vars()
13271 1727203826.53292: filtering new block on tags
13271 1727203826.53309: done filtering new block on tags
13271 1727203826.53312: in VariableManager get_vars()
13271 1727203826.53334: done with get_vars()
13271 1727203826.53336: filtering new block on tags
13271 1727203826.53355: done filtering new block on tags
13271 1727203826.53357: in VariableManager get_vars()
13271 1727203826.53382: done with get_vars()
13271 1727203826.53384: filtering new block on tags
13271 1727203826.53402: done filtering new block on tags
13271 1727203826.53404: done iterating over new_blocks loaded from include file
included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1
13271 1727203826.53409: extending task lists for all hosts with included blocks
13271 1727203826.54236: done extending task lists
13271 1727203826.54237: done processing included files
13271 1727203826.54238: results queue empty
13271 1727203826.54239: checking for any_errors_fatal
13271 1727203826.54242: done checking for any_errors_fatal
13271 1727203826.54242: checking for max_fail_percentage
13271 1727203826.54243: done checking for max_fail_percentage
13271 1727203826.54244: checking to see if all hosts have failed and the running result is not ok
13271 1727203826.54245: done checking to see if all hosts have failed
13271 1727203826.54246: getting the remaining hosts for this loop
13271 1727203826.54247: done getting the remaining hosts for this loop
13271 1727203826.54250: getting the next task for host managed-node1
13271 1727203826.54255: done getting next task for host managed-node1
13271 1727203826.54257: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
13271 1727203826.54263: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203826.54272: getting variables
13271 1727203826.54273: in VariableManager get_vars()
13271 1727203826.54290: Calling all_inventory to load vars for managed-node1
13271 1727203826.54292: Calling groups_inventory to load vars for managed-node1
13271 1727203826.54294: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203826.54299: Calling all_plugins_play to load vars for managed-node1
13271 1727203826.54302: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203826.54304: Calling groups_plugins_play to load vars for managed-node1
13271 1727203826.54465: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203826.54650: done with get_vars()
13271 1727203826.54660: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Tuesday 24 September 2024 14:50:26 -0400 (0:00:00.042) 0:00:10.190 *****

13271 1727203826.54726: entering _queue_task() for managed-node1/setup
13271 1727203826.54982: worker is 1 (out of 1 available)
13271 1727203826.54993: exiting _queue_task() for managed-node1/setup
13271 1727203826.55005: done queuing things up, now waiting for results queue to drain
13271 1727203826.55007: waiting for pending results...
13271 1727203826.55269: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
13271 1727203826.55408: in run() - task 028d2410-947f-2a40-12ba-000000000188
13271 1727203826.55429: variable 'ansible_search_path' from source: unknown
13271 1727203826.55437: variable 'ansible_search_path' from source: unknown
13271 1727203826.55483: calling self._execute()
13271 1727203826.55565: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203826.55579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203826.55592: variable 'omit' from source: magic vars
13271 1727203826.55922: variable 'ansible_distribution_major_version' from source: facts
13271 1727203826.55939: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203826.56491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13271 1727203826.60244: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13271 1727203826.60322: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13271 1727203826.60366: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13271 1727203826.60411: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13271 1727203826.60441: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13271 1727203826.60530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203826.60569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203826.60602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203826.60651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203826.60675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203826.60736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203826.60767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203826.60799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203826.60845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203826.60869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203826.61020: variable '__network_required_facts' from source: role '' defaults
13271 1727203826.61035: variable 'ansible_facts' from source: unknown
13271 1727203826.61380: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
13271 1727203826.61385: when evaluation is False, skipping this task
13271 1727203826.61388: _execute() done
13271 1727203826.61390: dumping result to json
13271 1727203826.61392: done dumping result, returning
13271 1727203826.61395: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-2a40-12ba-000000000188]
13271 1727203826.61397: sending task result for task 028d2410-947f-2a40-12ba-000000000188
13271 1727203826.61459: done sending task result for task 028d2410-947f-2a40-12ba-000000000188
13271 1727203826.61464: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
13271 1727203826.61512: no more pending results, returning what we have
13271 1727203826.61515: results queue empty
13271 1727203826.61516: checking for any_errors_fatal
13271 1727203826.61518: done checking for any_errors_fatal
13271 1727203826.61518: checking for max_fail_percentage
13271 1727203826.61520: done checking for max_fail_percentage
13271 1727203826.61521: checking to see if all hosts have failed and the running result is not ok
13271 1727203826.61522: done checking to see if all hosts have failed
13271 1727203826.61523: getting the remaining hosts for this loop
13271 1727203826.61524: done getting the remaining hosts for this loop
13271 1727203826.61528: getting the next task for host managed-node1
13271 1727203826.61537: done getting next task for host managed-node1
13271 1727203826.61541: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
13271 1727203826.61545: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203826.61559: getting variables
13271 1727203826.61564: in VariableManager get_vars()
13271 1727203826.61608: Calling all_inventory to load vars for managed-node1
13271 1727203826.61611: Calling groups_inventory to load vars for managed-node1
13271 1727203826.61613: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203826.61625: Calling all_plugins_play to load vars for managed-node1
13271 1727203826.61627: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203826.61630: Calling groups_plugins_play to load vars for managed-node1
13271 1727203826.62119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203826.62620: done with get_vars()
13271 1727203826.62634: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Tuesday 24 September 2024 14:50:26 -0400 (0:00:00.081) 0:00:10.272 *****

13271 1727203826.62903: entering _queue_task() for managed-node1/stat
13271 1727203826.63571: worker is 1 (out of 1 available)
13271 1727203826.63587: exiting _queue_task() for managed-node1/stat
13271 1727203826.63715: done queuing things up, now waiting for results queue to drain
13271 1727203826.63718: waiting for pending results...
13271 1727203826.64515: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree
13271 1727203826.64519: in run() - task 028d2410-947f-2a40-12ba-00000000018a
13271 1727203826.64522: variable 'ansible_search_path' from source: unknown
13271 1727203826.64524: variable 'ansible_search_path' from source: unknown
13271 1727203826.64526: calling self._execute()
13271 1727203826.64626: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203826.64785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203826.64798: variable 'omit' from source: magic vars
13271 1727203826.65699: variable 'ansible_distribution_major_version' from source: facts
13271 1727203826.65881: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203826.66363: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13271 1727203826.67101: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13271 1727203826.67209: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13271 1727203826.67245: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13271 1727203826.67311: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13271 1727203826.67426: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13271 1727203826.67458: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13271 1727203826.67495: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203826.67525: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13271 1727203826.67780: variable '__network_is_ostree' from source: set_fact
13271 1727203826.67793: Evaluated conditional (not __network_is_ostree is defined): False
13271 1727203826.67980: when evaluation is False, skipping this task
13271 1727203826.67983: _execute() done
13271 1727203826.67985: dumping result to json
13271 1727203826.67988: done dumping result, returning
13271 1727203826.67990: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-2a40-12ba-00000000018a]
13271 1727203826.67993: sending task result for task 028d2410-947f-2a40-12ba-00000000018a
13271 1727203826.68059: done sending task result for task 028d2410-947f-2a40-12ba-00000000018a
13271 1727203826.68065: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
13271 1727203826.68135: no more pending results, returning what we have
13271 1727203826.68139: results queue empty
13271 1727203826.68140: checking for any_errors_fatal
13271 1727203826.68145: done checking for any_errors_fatal
13271 1727203826.68146: checking for max_fail_percentage
13271 1727203826.68148: done checking for max_fail_percentage
13271 1727203826.68149: checking to see if all hosts have failed and the running result is not ok
13271 1727203826.68150: done checking to see if all hosts have failed
13271 1727203826.68151: getting the remaining hosts for this loop
13271 1727203826.68152: done getting the remaining hosts for this loop
13271 1727203826.68156: getting the next task for host managed-node1
13271 1727203826.68162: done getting next task for host managed-node1
13271 1727203826.68172: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
13271 1727203826.68179: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203826.68194: getting variables
13271 1727203826.68196: in VariableManager get_vars()
13271 1727203826.68236: Calling all_inventory to load vars for managed-node1
13271 1727203826.68240: Calling groups_inventory to load vars for managed-node1
13271 1727203826.68242: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203826.68255: Calling all_plugins_play to load vars for managed-node1
13271 1727203826.68258: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203826.68261: Calling groups_plugins_play to load vars for managed-node1
13271 1727203826.69005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203826.70042: done with get_vars()
13271 1727203826.70054: done getting variables
13271 1727203826.70426: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Tuesday 24 September 2024 14:50:26 -0400 (0:00:00.075) 0:00:10.347 *****

13271 1727203826.70464: entering _queue_task() for managed-node1/set_fact
13271 1727203826.71594: worker is 1 (out of 1 available)
13271 1727203826.71604: exiting _queue_task() for managed-node1/set_fact
13271 1727203826.71841: done queuing things up, now waiting for results queue to drain
13271 1727203826.71843: waiting for pending results...
13271 1727203826.72291: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13271 1727203826.72329: in run() - task 028d2410-947f-2a40-12ba-00000000018b 13271 1727203826.72333: variable 'ansible_search_path' from source: unknown 13271 1727203826.72335: variable 'ansible_search_path' from source: unknown 13271 1727203826.72337: calling self._execute() 13271 1727203826.72782: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203826.72786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203826.72788: variable 'omit' from source: magic vars 13271 1727203826.73069: variable 'ansible_distribution_major_version' from source: facts 13271 1727203826.73188: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203826.73358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13271 1727203826.73890: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13271 1727203826.73936: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13271 1727203826.74116: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13271 1727203826.74154: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13271 1727203826.74253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13271 1727203826.74581: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13271 1727203826.74613: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203826.74712: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13271 1727203826.74982: variable '__network_is_ostree' from source: set_fact 13271 1727203826.74985: Evaluated conditional (not __network_is_ostree is defined): False 13271 1727203826.74988: when evaluation is False, skipping this task 13271 1727203826.74996: _execute() done 13271 1727203826.75003: dumping result to json 13271 1727203826.75010: done dumping result, returning 13271 1727203826.75231: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-2a40-12ba-00000000018b] 13271 1727203826.75242: sending task result for task 028d2410-947f-2a40-12ba-00000000018b skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13271 1727203826.75395: no more pending results, returning what we have 13271 1727203826.75399: results queue empty 13271 1727203826.75400: checking for any_errors_fatal 13271 1727203826.75405: done checking for any_errors_fatal 13271 1727203826.75406: checking for max_fail_percentage 13271 1727203826.75408: done checking for max_fail_percentage 13271 1727203826.75409: checking to see if all hosts have failed and the running result is not ok 13271 1727203826.75410: done checking to see if all hosts have failed 13271 1727203826.75411: getting the remaining hosts for this loop 13271 1727203826.75412: done getting the remaining hosts for this loop 13271 1727203826.75415: getting the next task for host managed-node1 13271 1727203826.75425: done getting next task for host managed-node1 13271 
1727203826.75428: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13271 1727203826.75432: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203826.75445: getting variables 13271 1727203826.75447: in VariableManager get_vars() 13271 1727203826.75489: Calling all_inventory to load vars for managed-node1 13271 1727203826.75492: Calling groups_inventory to load vars for managed-node1 13271 1727203826.75494: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203826.75508: Calling all_plugins_play to load vars for managed-node1 13271 1727203826.75511: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203826.75515: Calling groups_plugins_play to load vars for managed-node1 13271 1727203826.76084: done sending task result for task 028d2410-947f-2a40-12ba-00000000018b 13271 1727203826.76087: WORKER PROCESS EXITING 13271 1727203826.76245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203826.77095: done with get_vars() 13271 1727203826.77108: done getting variables TASK [fedora.linux_system_roles.network : Check which 
services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:50:26 -0400 (0:00:00.068) 0:00:10.416 ***** 13271 1727203826.77320: entering _queue_task() for managed-node1/service_facts 13271 1727203826.77322: Creating lock for service_facts 13271 1727203826.78244: worker is 1 (out of 1 available) 13271 1727203826.78255: exiting _queue_task() for managed-node1/service_facts 13271 1727203826.78269: done queuing things up, now waiting for results queue to drain 13271 1727203826.78271: waiting for pending results... 13271 1727203826.78798: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 13271 1727203826.78803: in run() - task 028d2410-947f-2a40-12ba-00000000018d 13271 1727203826.79381: variable 'ansible_search_path' from source: unknown 13271 1727203826.79384: variable 'ansible_search_path' from source: unknown 13271 1727203826.79387: calling self._execute() 13271 1727203826.79390: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203826.79392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203826.79486: variable 'omit' from source: magic vars 13271 1727203826.79913: variable 'ansible_distribution_major_version' from source: facts 13271 1727203826.79930: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203826.79942: variable 'omit' from source: magic vars 13271 1727203826.80006: variable 'omit' from source: magic vars 13271 1727203826.80038: variable 'omit' from source: magic vars 13271 1727203826.80083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203826.80121: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203826.80146: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203826.80167: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203826.80185: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203826.80310: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203826.80320: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203826.80328: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203826.80726: Set connection var ansible_connection to ssh 13271 1727203826.80741: Set connection var ansible_shell_type to sh 13271 1727203826.80754: Set connection var ansible_timeout to 10 13271 1727203826.81081: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203826.81085: Set connection var ansible_pipelining to False 13271 1727203826.81089: Set connection var ansible_shell_executable to /bin/sh 13271 1727203826.81280: variable 'ansible_shell_executable' from source: unknown 13271 1727203826.81284: variable 'ansible_connection' from source: unknown 13271 1727203826.81286: variable 'ansible_module_compression' from source: unknown 13271 1727203826.81289: variable 'ansible_shell_type' from source: unknown 13271 1727203826.81291: variable 'ansible_shell_executable' from source: unknown 13271 1727203826.81292: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203826.81294: variable 'ansible_pipelining' from source: unknown 13271 1727203826.81297: variable 'ansible_timeout' from source: unknown 13271 1727203826.81299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203826.81508: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13271 1727203826.81546: variable 'omit' from source: magic vars 13271 1727203826.81557: starting attempt loop 13271 1727203826.81563: running the handler 13271 1727203826.81584: _low_level_execute_command(): starting 13271 1727203826.81597: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203826.82399: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203826.82434: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203826.82460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203826.82478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203826.82611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203826.84426: stdout chunk (state=3): >>>/root <<< 13271 1727203826.84555: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 13271 1727203826.84566: stdout chunk (state=3): >>><<< 13271 1727203826.84581: stderr chunk (state=3): >>><<< 13271 1727203826.84604: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203826.84873: _low_level_execute_command(): starting 13271 1727203826.84880: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203826.8478363-14072-232958117737832 `" && echo ansible-tmp-1727203826.8478363-14072-232958117737832="` echo /root/.ansible/tmp/ansible-tmp-1727203826.8478363-14072-232958117737832 `" ) && sleep 0' 13271 1727203826.85968: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203826.86012: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203826.86052: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203826.86193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203826.86291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203826.88984: stdout chunk (state=3): >>>ansible-tmp-1727203826.8478363-14072-232958117737832=/root/.ansible/tmp/ansible-tmp-1727203826.8478363-14072-232958117737832 <<< 13271 1727203826.88989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203826.88992: stdout chunk (state=3): >>><<< 13271 1727203826.88995: stderr chunk (state=3): >>><<< 13271 1727203826.88998: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203826.8478363-14072-232958117737832=/root/.ansible/tmp/ansible-tmp-1727203826.8478363-14072-232958117737832 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203826.89001: variable 'ansible_module_compression' from source: unknown 13271 1727203826.89004: ANSIBALLZ: Using lock for service_facts 13271 1727203826.89006: ANSIBALLZ: Acquiring lock 13271 1727203826.89009: ANSIBALLZ: Lock acquired: 140497828465072 13271 1727203826.89012: ANSIBALLZ: Creating module 13271 1727203827.06872: ANSIBALLZ: Writing module into payload 13271 1727203827.06981: ANSIBALLZ: Writing module 13271 1727203827.07013: ANSIBALLZ: Renaming module 13271 1727203827.07024: ANSIBALLZ: Done creating module 13271 1727203827.07048: variable 'ansible_facts' from source: unknown 13271 1727203827.07133: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203826.8478363-14072-232958117737832/AnsiballZ_service_facts.py 13271 1727203827.07304: Sending initial data 13271 1727203827.07307: Sent initial data (162 bytes) 13271 1727203827.07953: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203827.08120: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203827.08291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203827.10090: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203827.10145: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13271 1727203827.10215: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmp3llt087q /root/.ansible/tmp/ansible-tmp-1727203826.8478363-14072-232958117737832/AnsiballZ_service_facts.py <<< 13271 1727203827.10218: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203826.8478363-14072-232958117737832/AnsiballZ_service_facts.py" <<< 13271 1727203827.10479: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmp3llt087q" to remote "/root/.ansible/tmp/ansible-tmp-1727203826.8478363-14072-232958117737832/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203826.8478363-14072-232958117737832/AnsiballZ_service_facts.py" <<< 13271 1727203827.12150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203827.12153: stdout chunk (state=3): >>><<< 13271 1727203827.12155: stderr chunk (state=3): >>><<< 13271 1727203827.12158: done transferring module to remote 13271 1727203827.12161: _low_level_execute_command(): starting 13271 1727203827.12166: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203826.8478363-14072-232958117737832/ /root/.ansible/tmp/ansible-tmp-1727203826.8478363-14072-232958117737832/AnsiballZ_service_facts.py && sleep 0' 13271 1727203827.13692: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203827.14046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203827.14058: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203827.14151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203827.16305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203827.16313: stderr chunk (state=3): >>><<< 13271 1727203827.16315: stdout chunk (state=3): >>><<< 13271 1727203827.16382: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203827.16385: _low_level_execute_command(): starting 13271 1727203827.16387: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203826.8478363-14072-232958117737832/AnsiballZ_service_facts.py && sleep 0' 13271 1727203827.17005: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203827.17018: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203827.17035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203827.17054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203827.17093: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203827.17108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203827.17125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203827.17205: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' 
debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203827.17227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203827.17346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203828.94353: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, <<< 13271 1727203828.94438: stdout chunk (state=3): >>>"selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, 
"syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": 
"systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": 
"inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", 
"source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13271 1727203828.96099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 13271 1727203828.96126: stderr chunk (state=3): >>><<< 13271 1727203828.96131: stdout chunk (state=3): >>><<< 13271 1727203828.96150: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", 
"state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", 
"source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": 
"sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": 
"systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": 
{"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": 
"systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.14.47 closed. 13271 1727203828.97417: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203826.8478363-14072-232958117737832/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203828.97425: _low_level_execute_command(): starting 13271 1727203828.97430: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203826.8478363-14072-232958117737832/ > /dev/null 2>&1 && sleep 0' 13271 1727203828.97870: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203828.97881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203828.97898: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203828.97901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203828.97960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203828.97969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203828.97972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203828.98049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203829.00027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203829.00054: stderr chunk (state=3): >>><<< 13271 1727203829.00059: stdout chunk (state=3): >>><<< 13271 1727203829.00072: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session 
id: 2 debug2: Received exit status from master 0 13271 1727203829.00078: handler run complete 13271 1727203829.00189: variable 'ansible_facts' from source: unknown 13271 1727203829.00288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203829.00548: variable 'ansible_facts' from source: unknown 13271 1727203829.00627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203829.00742: attempt loop complete, returning result 13271 1727203829.00745: _execute() done 13271 1727203829.00749: dumping result to json 13271 1727203829.00791: done dumping result, returning 13271 1727203829.00799: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-2a40-12ba-00000000018d] 13271 1727203829.00804: sending task result for task 028d2410-947f-2a40-12ba-00000000018d ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13271 1727203829.01380: no more pending results, returning what we have 13271 1727203829.01382: results queue empty 13271 1727203829.01383: checking for any_errors_fatal 13271 1727203829.01388: done checking for any_errors_fatal 13271 1727203829.01388: checking for max_fail_percentage 13271 1727203829.01390: done checking for max_fail_percentage 13271 1727203829.01391: checking to see if all hosts have failed and the running result is not ok 13271 1727203829.01391: done checking to see if all hosts have failed 13271 1727203829.01392: getting the remaining hosts for this loop 13271 1727203829.01393: done getting the remaining hosts for this loop 13271 1727203829.01396: getting the next task for host managed-node1 13271 1727203829.01400: done getting next task for host managed-node1 13271 1727203829.01403: ^ task is: TASK: fedora.linux_system_roles.network : 
Check which packages are installed 13271 1727203829.01407: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203829.01415: getting variables 13271 1727203829.01416: in VariableManager get_vars() 13271 1727203829.01441: Calling all_inventory to load vars for managed-node1 13271 1727203829.01442: Calling groups_inventory to load vars for managed-node1 13271 1727203829.01444: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203829.01451: Calling all_plugins_play to load vars for managed-node1 13271 1727203829.01453: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203829.01457: Calling groups_plugins_play to load vars for managed-node1 13271 1727203829.01789: done sending task result for task 028d2410-947f-2a40-12ba-00000000018d 13271 1727203829.01793: WORKER PROCESS EXITING 13271 1727203829.01803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203829.02079: done with get_vars() 13271 1727203829.02088: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:50:29 -0400 (0:00:02.248) 0:00:12.664 ***** 13271 1727203829.02156: entering _queue_task() for managed-node1/package_facts 13271 1727203829.02157: Creating lock for package_facts 13271 1727203829.02381: worker is 1 (out of 1 available) 13271 1727203829.02393: exiting _queue_task() for managed-node1/package_facts 13271 1727203829.02404: done queuing things up, now waiting for results queue to drain 13271 1727203829.02406: waiting for pending results... 13271 1727203829.02564: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 13271 1727203829.02645: in run() - task 028d2410-947f-2a40-12ba-00000000018e 13271 1727203829.02658: variable 'ansible_search_path' from source: unknown 13271 1727203829.02664: variable 'ansible_search_path' from source: unknown 13271 1727203829.02689: calling self._execute() 13271 1727203829.02749: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203829.02752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203829.02761: variable 'omit' from source: magic vars 13271 1727203829.03014: variable 'ansible_distribution_major_version' from source: facts 13271 1727203829.03023: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203829.03029: variable 'omit' from source: magic vars 13271 1727203829.03080: variable 'omit' from source: magic vars 13271 1727203829.03104: variable 'omit' from source: magic vars 13271 1727203829.03133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203829.03159: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203829.03185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
13271 1727203829.03196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203829.03206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203829.03229: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203829.03231: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203829.03234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203829.03302: Set connection var ansible_connection to ssh 13271 1727203829.03309: Set connection var ansible_shell_type to sh 13271 1727203829.03316: Set connection var ansible_timeout to 10 13271 1727203829.03321: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203829.03326: Set connection var ansible_pipelining to False 13271 1727203829.03331: Set connection var ansible_shell_executable to /bin/sh 13271 1727203829.03349: variable 'ansible_shell_executable' from source: unknown 13271 1727203829.03352: variable 'ansible_connection' from source: unknown 13271 1727203829.03354: variable 'ansible_module_compression' from source: unknown 13271 1727203829.03357: variable 'ansible_shell_type' from source: unknown 13271 1727203829.03359: variable 'ansible_shell_executable' from source: unknown 13271 1727203829.03364: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203829.03366: variable 'ansible_pipelining' from source: unknown 13271 1727203829.03369: variable 'ansible_timeout' from source: unknown 13271 1727203829.03371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203829.03514: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13271 1727203829.03519: variable 'omit' from source: magic vars 13271 1727203829.03524: starting attempt loop 13271 1727203829.03527: running the handler 13271 1727203829.03538: _low_level_execute_command(): starting 13271 1727203829.03545: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203829.04057: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203829.04060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203829.04068: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203829.04071: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203829.04123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203829.04126: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203829.04128: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203829.04214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 13271 1727203829.05972: stdout chunk (state=3): >>>/root <<< 13271 1727203829.06073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203829.06103: stderr chunk (state=3): >>><<< 13271 1727203829.06106: stdout chunk (state=3): >>><<< 13271 1727203829.06129: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203829.06139: _low_level_execute_command(): starting 13271 1727203829.06145: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203829.0612793-14355-165402172521047 `" && echo ansible-tmp-1727203829.0612793-14355-165402172521047="` echo /root/.ansible/tmp/ansible-tmp-1727203829.0612793-14355-165402172521047 `" ) && sleep 0' 13271 
1727203829.06584: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203829.06588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203829.06599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203829.06651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203829.06660: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203829.06665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203829.06732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203829.08823: stdout chunk (state=3): >>>ansible-tmp-1727203829.0612793-14355-165402172521047=/root/.ansible/tmp/ansible-tmp-1727203829.0612793-14355-165402172521047 <<< 13271 1727203829.08937: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203829.08963: stderr chunk (state=3): >>><<< 13271 1727203829.08967: stdout chunk (state=3): >>><<< 13271 1727203829.08984: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727203829.0612793-14355-165402172521047=/root/.ansible/tmp/ansible-tmp-1727203829.0612793-14355-165402172521047 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203829.09023: variable 'ansible_module_compression' from source: unknown 13271 1727203829.09061: ANSIBALLZ: Using lock for package_facts 13271 1727203829.09065: ANSIBALLZ: Acquiring lock 13271 1727203829.09067: ANSIBALLZ: Lock acquired: 140497826919488 13271 1727203829.09072: ANSIBALLZ: Creating module 13271 1727203829.43289: ANSIBALLZ: Writing module into payload 13271 1727203829.43380: ANSIBALLZ: Writing module 13271 1727203829.43404: ANSIBALLZ: Renaming module 13271 1727203829.43409: ANSIBALLZ: Done creating module 13271 1727203829.43440: variable 'ansible_facts' from source: unknown 13271 1727203829.43564: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727203829.0612793-14355-165402172521047/AnsiballZ_package_facts.py 13271 1727203829.43677: Sending initial data 13271 1727203829.43681: Sent initial data (162 bytes) 13271 1727203829.44160: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203829.44229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203829.44232: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203829.44265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203829.44268: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203829.44270: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203829.44365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203829.46109: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: 
Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203829.46186: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13271 1727203829.46258: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmps4cwcvfh /root/.ansible/tmp/ansible-tmp-1727203829.0612793-14355-165402172521047/AnsiballZ_package_facts.py <<< 13271 1727203829.46266: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203829.0612793-14355-165402172521047/AnsiballZ_package_facts.py" <<< 13271 1727203829.46331: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmps4cwcvfh" to remote "/root/.ansible/tmp/ansible-tmp-1727203829.0612793-14355-165402172521047/AnsiballZ_package_facts.py" <<< 13271 1727203829.46335: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203829.0612793-14355-165402172521047/AnsiballZ_package_facts.py" <<< 13271 1727203829.47659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203829.47839: stderr chunk (state=3): >>><<< 13271 1727203829.47842: stdout chunk (state=3): >>><<< 13271 1727203829.47844: done transferring module to remote 13271 1727203829.47846: _low_level_execute_command(): starting 13271 1727203829.47848: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727203829.0612793-14355-165402172521047/ /root/.ansible/tmp/ansible-tmp-1727203829.0612793-14355-165402172521047/AnsiballZ_package_facts.py && sleep 0' 13271 1727203829.48388: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203829.48414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203829.48418: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203829.48424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203829.48481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203829.48487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203829.48492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203829.48569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203829.50518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203829.50546: stderr chunk (state=3): >>><<< 13271 1727203829.50549: stdout chunk (state=3): >>><<< 13271 1727203829.50562: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203829.50568: _low_level_execute_command(): starting 13271 1727203829.50580: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203829.0612793-14355-165402172521047/AnsiballZ_package_facts.py && sleep 0' 13271 1727203829.51030: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203829.51034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203829.51036: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203829.51038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203829.51085: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203829.51092: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203829.51196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203829.98290: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 13271 1727203829.98305: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": 
"libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": 
"libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 13271 1727203829.98315: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": 
"libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", 
"release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10",<<< 13271 1727203829.98325: stdout chunk (state=3): >>> "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": 
"dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": 
[{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "<<< 13271 1727203829.98329: stdout chunk (state=3): >>>x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": 
"40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": 
"0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", 
"version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [<<< 13271 1727203829.98405: stdout chunk (state=3): >>>{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", 
"version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": 
"git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": 
[{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": 
"noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": 
"5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", 
"source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", 
"epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch<<< 13271 1727203829.98415: stdout chunk (state=3): >>>": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": 
[{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": 
"dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13271 1727203830.00502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 13271 1727203830.00681: stderr chunk (state=3): >>><<< 13271 1727203830.00685: stdout chunk (state=3): >>><<< 13271 1727203830.00795: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": 
[{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", 
"release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": 
"NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", 
"version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": 
[{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": 
"1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", 
"version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", 
"version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", 
"release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": 
"mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": 
"python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": 
"13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": 
"3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": 
[{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", 
"source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": 
"69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": 
"2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
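The JSON payload ending above is the return value of the `package_facts` module: `ansible_facts.packages` maps each installed package name to a *list* of instance dicts, because a package can be installed for more than one version or architecture at once. A minimal sketch of that shape (the `kernel` entry and the lookup helper are illustrative, not part of the module's output):

```python
# Illustrative model of the package_facts "packages" fact: each key maps to a
# list of instance dicts, since several versions/arches can coexist.
packages = {
    "perl-Git": [
        {"name": "perl-Git", "version": "2.45.2", "release": "3.el10",
         "epoch": None, "arch": "noarch", "source": "rpm"},
    ],
    "kernel": [  # hypothetical multi-instance entry, not from the log above
        {"name": "kernel", "version": "6.11.0", "release": "1.el10",
         "epoch": None, "arch": "x86_64", "source": "rpm"},
        {"name": "kernel", "version": "6.10.0", "release": "9.el10",
         "epoch": None, "arch": "x86_64", "source": "rpm"},
    ],
}

def installed_versions(facts: dict, name: str) -> list:
    """Return every installed version string for a package name, or []."""
    return [p["version"] for p in facts.get(name, [])]

print(installed_versions(packages, "perl-Git"))    # ['2.45.2']
print(installed_versions(packages, "kernel"))      # two entries
print(installed_versions(packages, "absent-pkg"))  # []
```

Note that even an absent package is handled uniformly: `facts.get(name, [])` keeps the lookup total, which is why role tasks can probe for optional packages without extra guards.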
auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 13271 1727203830.03252: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203829.0612793-14355-165402172521047/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203830.03285: _low_level_execute_command(): starting 13271 1727203830.03294: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203829.0612793-14355-165402172521047/ > /dev/null 2>&1 && sleep 0' 13271 1727203830.03899: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203830.03913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203830.03928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203830.03944: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203830.03959: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203830.04054: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203830.04111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203830.04192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203830.06255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203830.06260: stdout chunk (state=3): >>><<< 13271 1727203830.06268: stderr chunk (state=3): >>><<< 13271 1727203830.06303: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203830.06309: handler run complete 13271 1727203830.06893: variable 'ansible_facts' from source: unknown 13271 1727203830.07147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203830.08177: variable 'ansible_facts' from source: unknown 13271 1727203830.08595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203830.09263: attempt loop complete, returning result 13271 1727203830.09266: _execute() done 13271 1727203830.09268: dumping result to json 13271 1727203830.09430: done dumping result, returning 13271 1727203830.09452: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-2a40-12ba-00000000018e] 13271 1727203830.09456: sending task result for task 028d2410-947f-2a40-12ba-00000000018e 13271 1727203830.10683: done sending task result for task 028d2410-947f-2a40-12ba-00000000018e 13271 1727203830.10687: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13271 1727203830.10729: no more pending results, returning what we have 13271 1727203830.10731: results queue empty 13271 1727203830.10731: checking for any_errors_fatal 13271 1727203830.10734: done checking for any_errors_fatal 13271 1727203830.10735: checking for max_fail_percentage 13271 1727203830.10736: done checking for max_fail_percentage 13271 1727203830.10736: checking to see if all hosts have failed and the running result 
is not ok 13271 1727203830.10737: done checking to see if all hosts have failed 13271 1727203830.10737: getting the remaining hosts for this loop 13271 1727203830.10738: done getting the remaining hosts for this loop 13271 1727203830.10740: getting the next task for host managed-node1 13271 1727203830.10745: done getting next task for host managed-node1 13271 1727203830.10747: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13271 1727203830.10749: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203830.10754: getting variables 13271 1727203830.10755: in VariableManager get_vars() 13271 1727203830.10783: Calling all_inventory to load vars for managed-node1 13271 1727203830.10785: Calling groups_inventory to load vars for managed-node1 13271 1727203830.10787: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203830.10793: Calling all_plugins_play to load vars for managed-node1 13271 1727203830.10795: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203830.10796: Calling groups_plugins_play to load vars for managed-node1 13271 1727203830.11518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203830.13042: done with get_vars() 13271 1727203830.13063: done getting variables 13271 1727203830.13110: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:50:30 -0400 (0:00:01.109) 0:00:13.774 ***** 13271 1727203830.13138: entering _queue_task() for managed-node1/debug 13271 1727203830.13369: worker is 1 (out of 1 available) 13271 1727203830.13385: exiting _queue_task() for managed-node1/debug 13271 1727203830.13399: done queuing things up, now waiting for results queue to drain 13271 1727203830.13400: waiting for pending results... 
13271 1727203830.13564: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 13271 1727203830.13644: in run() - task 028d2410-947f-2a40-12ba-000000000027 13271 1727203830.13656: variable 'ansible_search_path' from source: unknown 13271 1727203830.13659: variable 'ansible_search_path' from source: unknown 13271 1727203830.13692: calling self._execute() 13271 1727203830.13754: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203830.13760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203830.13771: variable 'omit' from source: magic vars 13271 1727203830.14031: variable 'ansible_distribution_major_version' from source: facts 13271 1727203830.14039: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203830.14045: variable 'omit' from source: magic vars 13271 1727203830.14088: variable 'omit' from source: magic vars 13271 1727203830.14153: variable 'network_provider' from source: set_fact 13271 1727203830.14176: variable 'omit' from source: magic vars 13271 1727203830.14204: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203830.14231: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203830.14247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203830.14261: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203830.14275: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203830.14300: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203830.14304: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 
1727203830.14306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203830.14370: Set connection var ansible_connection to ssh 13271 1727203830.14379: Set connection var ansible_shell_type to sh 13271 1727203830.14387: Set connection var ansible_timeout to 10 13271 1727203830.14390: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203830.14400: Set connection var ansible_pipelining to False 13271 1727203830.14402: Set connection var ansible_shell_executable to /bin/sh 13271 1727203830.14441: variable 'ansible_shell_executable' from source: unknown 13271 1727203830.14444: variable 'ansible_connection' from source: unknown 13271 1727203830.14447: variable 'ansible_module_compression' from source: unknown 13271 1727203830.14449: variable 'ansible_shell_type' from source: unknown 13271 1727203830.14451: variable 'ansible_shell_executable' from source: unknown 13271 1727203830.14460: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203830.14462: variable 'ansible_pipelining' from source: unknown 13271 1727203830.14469: variable 'ansible_timeout' from source: unknown 13271 1727203830.14471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203830.14634: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203830.14637: variable 'omit' from source: magic vars 13271 1727203830.14640: starting attempt loop 13271 1727203830.14643: running the handler 13271 1727203830.14690: handler run complete 13271 1727203830.14694: attempt loop complete, returning result 13271 1727203830.14696: _execute() done 13271 1727203830.14699: dumping result to json 13271 1727203830.14702: done dumping result, returning 
13271 1727203830.14704: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-2a40-12ba-000000000027] 13271 1727203830.14706: sending task result for task 028d2410-947f-2a40-12ba-000000000027 13271 1727203830.14786: done sending task result for task 028d2410-947f-2a40-12ba-000000000027 13271 1727203830.14789: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: Using network provider: nm 13271 1727203830.14843: no more pending results, returning what we have 13271 1727203830.14846: results queue empty 13271 1727203830.14847: checking for any_errors_fatal 13271 1727203830.14856: done checking for any_errors_fatal 13271 1727203830.14857: checking for max_fail_percentage 13271 1727203830.14858: done checking for max_fail_percentage 13271 1727203830.14859: checking to see if all hosts have failed and the running result is not ok 13271 1727203830.14860: done checking to see if all hosts have failed 13271 1727203830.14860: getting the remaining hosts for this loop 13271 1727203830.14862: done getting the remaining hosts for this loop 13271 1727203830.14866: getting the next task for host managed-node1 13271 1727203830.14871: done getting next task for host managed-node1 13271 1727203830.14876: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13271 1727203830.14880: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 13271 1727203830.14891: getting variables 13271 1727203830.14893: in VariableManager get_vars() 13271 1727203830.14929: Calling all_inventory to load vars for managed-node1 13271 1727203830.14932: Calling groups_inventory to load vars for managed-node1 13271 1727203830.14934: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203830.14943: Calling all_plugins_play to load vars for managed-node1 13271 1727203830.14945: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203830.14947: Calling groups_plugins_play to load vars for managed-node1 13271 1727203830.16210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203830.17390: done with get_vars() 13271 1727203830.17410: done getting variables 13271 1727203830.17484: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:50:30 -0400 (0:00:00.043) 0:00:13.818 ***** 13271 1727203830.17508: entering _queue_task() for managed-node1/fail 13271 1727203830.17509: Creating lock for fail 13271 1727203830.17748: worker is 1 (out of 1 available) 13271 1727203830.17761: exiting _queue_task() for managed-node1/fail 13271 1727203830.17778: done queuing things up, now waiting for results queue to drain 13271 1727203830.17780: waiting for pending results... 
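The "Set connection var ..." lines in the trace above show the TaskExecutor resolving per-task connection options: a host-supplied variable wins, and anything the trace later reports as "from source: unknown" falls back to a built-in default (`ssh`, `sh`, timeout 10, pipelining off, `/bin/sh`). A rough sketch of that precedence, with defaults taken from the trace; the merge helper itself is illustrative, not Ansible's actual implementation:

```python
# Fallback values as reported by the "Set connection var" trace lines.
DEFAULTS = {
    "ansible_connection": "ssh",
    "ansible_shell_type": "sh",
    "ansible_timeout": 10,
    "ansible_pipelining": False,
    "ansible_shell_executable": "/bin/sh",
}

def resolve_connection_vars(host_vars: dict) -> dict:
    """Host-specific values win; anything unset falls back to the default."""
    return {key: host_vars.get(key, default) for key, default in DEFAULTS.items()}

# managed-node1's inventory only sets ansible_host / ansible_ssh_extra_args,
# so every option modeled here resolves to its default:
resolved = resolve_connection_vars({"ansible_host": "10.31.14.47"})
print(resolved["ansible_timeout"])     # 10
print(resolved["ansible_connection"])  # ssh
```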
13271 1727203830.17948: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 13271 1727203830.18031: in run() - task 028d2410-947f-2a40-12ba-000000000028 13271 1727203830.18042: variable 'ansible_search_path' from source: unknown 13271 1727203830.18046: variable 'ansible_search_path' from source: unknown 13271 1727203830.18077: calling self._execute() 13271 1727203830.18143: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203830.18146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203830.18154: variable 'omit' from source: magic vars 13271 1727203830.18420: variable 'ansible_distribution_major_version' from source: facts 13271 1727203830.18427: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203830.18513: variable 'network_state' from source: role '' defaults 13271 1727203830.18521: Evaluated conditional (network_state != {}): False 13271 1727203830.18524: when evaluation is False, skipping this task 13271 1727203830.18526: _execute() done 13271 1727203830.18529: dumping result to json 13271 1727203830.18532: done dumping result, returning 13271 1727203830.18539: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-2a40-12ba-000000000028] 13271 1727203830.18545: sending task result for task 028d2410-947f-2a40-12ba-000000000028 13271 1727203830.18627: done sending task result for task 028d2410-947f-2a40-12ba-000000000028 13271 1727203830.18630: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13271 1727203830.18724: no more pending results, 
returning what we have 13271 1727203830.18728: results queue empty 13271 1727203830.18729: checking for any_errors_fatal 13271 1727203830.18734: done checking for any_errors_fatal 13271 1727203830.18734: checking for max_fail_percentage 13271 1727203830.18736: done checking for max_fail_percentage 13271 1727203830.18737: checking to see if all hosts have failed and the running result is not ok 13271 1727203830.18738: done checking to see if all hosts have failed 13271 1727203830.18738: getting the remaining hosts for this loop 13271 1727203830.18740: done getting the remaining hosts for this loop 13271 1727203830.18743: getting the next task for host managed-node1 13271 1727203830.18749: done getting next task for host managed-node1 13271 1727203830.18752: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13271 1727203830.18755: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203830.18775: getting variables 13271 1727203830.18781: in VariableManager get_vars() 13271 1727203830.18815: Calling all_inventory to load vars for managed-node1 13271 1727203830.18818: Calling groups_inventory to load vars for managed-node1 13271 1727203830.18820: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203830.18828: Calling all_plugins_play to load vars for managed-node1 13271 1727203830.18830: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203830.18833: Calling groups_plugins_play to load vars for managed-node1 13271 1727203830.20093: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203830.21079: done with get_vars() 13271 1727203830.21096: done getting variables 13271 1727203830.21140: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:50:30 -0400 (0:00:00.036) 0:00:13.854 ***** 13271 1727203830.21170: entering _queue_task() for managed-node1/fail 13271 1727203830.21418: worker is 1 (out of 1 available) 13271 1727203830.21430: exiting _queue_task() for managed-node1/fail 13271 1727203830.21442: done queuing things up, now waiting for results queue to drain 13271 1727203830.21444: waiting for pending results... 
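The skip recorded above illustrates the role's guard pattern: a `fail` task whose `when:` condition never fires in this run because `network_state` keeps its role default of `{}` (the log reports `false_condition: "network_state != {}"`). A hypothetical reconstruction of that guard at `roles/network/tasks/main.yml:11` might look like the sketch below; the `when:` expression is taken from the log, while the `msg` text is an assumption, not the role's verbatim source.

```yaml
# Sketch of the guard task at tasks/main.yml:11 (hypothetical
# reconstruction; only the `when:` condition is confirmed by the log).
- name: Abort applying the network state configuration if using the
    `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: >-  # assumed message text
      Applying the network state configuration is not supported
      with the initscripts provider.
  when: network_state != {}
```

Because the evaluation is False, the action plugin is never invoked and the host result is `skipping:` with `skip_reason: "Conditional result was False"`, exactly as shown in the log output.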
13271 1727203830.21609: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 13271 1727203830.21687: in run() - task 028d2410-947f-2a40-12ba-000000000029 13271 1727203830.21698: variable 'ansible_search_path' from source: unknown 13271 1727203830.21702: variable 'ansible_search_path' from source: unknown 13271 1727203830.21730: calling self._execute() 13271 1727203830.21796: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203830.21800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203830.21808: variable 'omit' from source: magic vars 13271 1727203830.22064: variable 'ansible_distribution_major_version' from source: facts 13271 1727203830.22071: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203830.22154: variable 'network_state' from source: role '' defaults 13271 1727203830.22164: Evaluated conditional (network_state != {}): False 13271 1727203830.22168: when evaluation is False, skipping this task 13271 1727203830.22170: _execute() done 13271 1727203830.22172: dumping result to json 13271 1727203830.22177: done dumping result, returning 13271 1727203830.22181: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-2a40-12ba-000000000029] 13271 1727203830.22187: sending task result for task 028d2410-947f-2a40-12ba-000000000029 13271 1727203830.22271: done sending task result for task 028d2410-947f-2a40-12ba-000000000029 13271 1727203830.22274: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13271 1727203830.22318: no more pending results, returning what we have 13271 
1727203830.22322: results queue empty 13271 1727203830.22323: checking for any_errors_fatal 13271 1727203830.22330: done checking for any_errors_fatal 13271 1727203830.22331: checking for max_fail_percentage 13271 1727203830.22332: done checking for max_fail_percentage 13271 1727203830.22333: checking to see if all hosts have failed and the running result is not ok 13271 1727203830.22334: done checking to see if all hosts have failed 13271 1727203830.22335: getting the remaining hosts for this loop 13271 1727203830.22336: done getting the remaining hosts for this loop 13271 1727203830.22339: getting the next task for host managed-node1 13271 1727203830.22346: done getting next task for host managed-node1 13271 1727203830.22349: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13271 1727203830.22352: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203830.22369: getting variables 13271 1727203830.22371: in VariableManager get_vars() 13271 1727203830.22410: Calling all_inventory to load vars for managed-node1 13271 1727203830.22412: Calling groups_inventory to load vars for managed-node1 13271 1727203830.22414: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203830.22423: Calling all_plugins_play to load vars for managed-node1 13271 1727203830.22425: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203830.22427: Calling groups_plugins_play to load vars for managed-node1 13271 1727203830.23189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203830.24059: done with get_vars() 13271 1727203830.24085: done getting variables 13271 1727203830.24130: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:50:30 -0400 (0:00:00.029) 0:00:13.884 ***** 13271 1727203830.24155: entering _queue_task() for managed-node1/fail 13271 1727203830.24407: worker is 1 (out of 1 available) 13271 1727203830.24421: exiting _queue_task() for managed-node1/fail 13271 1727203830.24433: done queuing things up, now waiting for results queue to drain 13271 1727203830.24435: waiting for pending results... 
13271 1727203830.24600: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 13271 1727203830.24679: in run() - task 028d2410-947f-2a40-12ba-00000000002a 13271 1727203830.24690: variable 'ansible_search_path' from source: unknown 13271 1727203830.24693: variable 'ansible_search_path' from source: unknown 13271 1727203830.24722: calling self._execute() 13271 1727203830.24788: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203830.24792: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203830.24800: variable 'omit' from source: magic vars 13271 1727203830.25054: variable 'ansible_distribution_major_version' from source: facts 13271 1727203830.25066: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203830.25186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13271 1727203830.26671: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13271 1727203830.26715: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13271 1727203830.26744: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13271 1727203830.26770: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13271 1727203830.26790: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13271 1727203830.26848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.26870: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.26890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.26916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.26926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.26997: variable 'ansible_distribution_major_version' from source: facts 13271 1727203830.27009: Evaluated conditional (ansible_distribution_major_version | int > 9): True 13271 1727203830.27093: variable 'ansible_distribution' from source: facts 13271 1727203830.27097: variable '__network_rh_distros' from source: role '' defaults 13271 1727203830.27105: Evaluated conditional (ansible_distribution in __network_rh_distros): True 13271 1727203830.27257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.27282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.27297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 
1727203830.27322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.27333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.27367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.27383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.27404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.27429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.27439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.27469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.27487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 13271 1727203830.27507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.27530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.27540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.27731: variable 'network_connections' from source: task vars 13271 1727203830.27739: variable 'controller_profile' from source: play vars 13271 1727203830.27786: variable 'controller_profile' from source: play vars 13271 1727203830.27794: variable 'controller_device' from source: play vars 13271 1727203830.27838: variable 'controller_device' from source: play vars 13271 1727203830.27846: variable 'port1_profile' from source: play vars 13271 1727203830.27889: variable 'port1_profile' from source: play vars 13271 1727203830.27895: variable 'dhcp_interface1' from source: play vars 13271 1727203830.27939: variable 'dhcp_interface1' from source: play vars 13271 1727203830.27943: variable 'controller_profile' from source: play vars 13271 1727203830.27985: variable 'controller_profile' from source: play vars 13271 1727203830.27991: variable 'port2_profile' from source: play vars 13271 1727203830.28031: variable 'port2_profile' from source: play vars 13271 1727203830.28038: variable 'dhcp_interface2' from source: play vars 13271 1727203830.28082: variable 'dhcp_interface2' from source: play vars 13271 1727203830.28088: variable 'controller_profile' from source: play vars 13271 1727203830.28389: variable 'controller_profile' from source: play vars 13271 1727203830.28396: 
variable 'network_state' from source: role '' defaults 13271 1727203830.28444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13271 1727203830.28565: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13271 1727203830.28599: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13271 1727203830.28619: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13271 1727203830.28639: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13271 1727203830.28674: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13271 1727203830.28691: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13271 1727203830.28713: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.28730: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13271 1727203830.28760: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 13271 1727203830.28767: when evaluation is False, skipping this task 13271 1727203830.28770: _execute() done 13271 1727203830.28772: dumping result to 
json 13271 1727203830.28774: done dumping result, returning 13271 1727203830.28779: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-2a40-12ba-00000000002a] 13271 1727203830.28784: sending task result for task 028d2410-947f-2a40-12ba-00000000002a 13271 1727203830.28873: done sending task result for task 028d2410-947f-2a40-12ba-00000000002a 13271 1727203830.28878: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 13271 1727203830.28955: no more pending results, returning what we have 13271 1727203830.28958: results queue empty 13271 1727203830.28959: checking for any_errors_fatal 13271 1727203830.28965: done checking for any_errors_fatal 13271 1727203830.28966: checking for max_fail_percentage 13271 1727203830.28968: done checking for max_fail_percentage 13271 1727203830.28969: checking to see if all hosts have failed and the running result is not ok 13271 1727203830.28970: done checking to see if all hosts have failed 13271 1727203830.28970: getting the remaining hosts for this loop 13271 1727203830.28972: done getting the remaining hosts for this loop 13271 1727203830.28977: getting the next task for host managed-node1 13271 1727203830.28982: done getting next task for host managed-node1 13271 1727203830.28986: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13271 1727203830.28988: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203830.29002: getting variables 13271 1727203830.29003: in VariableManager get_vars() 13271 1727203830.29039: Calling all_inventory to load vars for managed-node1 13271 1727203830.29042: Calling groups_inventory to load vars for managed-node1 13271 1727203830.29044: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203830.29052: Calling all_plugins_play to load vars for managed-node1 13271 1727203830.29055: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203830.29058: Calling groups_plugins_play to load vars for managed-node1 13271 1727203830.30139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203830.31700: done with get_vars() 13271 1727203830.31728: done getting variables 13271 1727203830.31837: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:50:30 -0400 (0:00:00.077) 0:00:13.961 ***** 
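The teaming-abort skip above records three conditionals evaluated in order: `ansible_distribution_major_version | int > 9` (True), `ansible_distribution in __network_rh_distros` (True), and finally the `selectattr` chain over `network_connections` and `network_state` (False, since no connection has `type: team`). A hypothetical sketch of that task at `roles/network/tasks/main.yml:25` is below; the `when:` expressions are copied from the evaluated conditionals in the log, while the task body is an assumption.

```yaml
# Sketch of the EL10 teaming guard at tasks/main.yml:25 (hypothetical
# reconstruction; the `when:` list mirrors the conditionals the log
# shows being evaluated, the `msg` is assumed).
- name: Abort applying teaming configuration if the system version of
    the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Teaming is not supported on EL10 or later.  # assumed
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros
    - >-
      network_connections | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
      or network_state.get("interfaces", [])
      | selectattr("type", "defined")
      | selectattr("type", "match", "^team$") | list | length > 0
```

A list under `when:` is ANDed, which matches the log's behavior: the first two conditions evaluate True, so the third is evaluated, comes back False, and the task is skipped with that expression reported as `false_condition`.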
13271 1727203830.31877: entering _queue_task() for managed-node1/dnf 13271 1727203830.32221: worker is 1 (out of 1 available) 13271 1727203830.32234: exiting _queue_task() for managed-node1/dnf 13271 1727203830.32246: done queuing things up, now waiting for results queue to drain 13271 1727203830.32247: waiting for pending results... 13271 1727203830.32696: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 13271 1727203830.32702: in run() - task 028d2410-947f-2a40-12ba-00000000002b 13271 1727203830.32706: variable 'ansible_search_path' from source: unknown 13271 1727203830.32708: variable 'ansible_search_path' from source: unknown 13271 1727203830.32745: calling self._execute() 13271 1727203830.32847: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203830.32860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203830.32883: variable 'omit' from source: magic vars 13271 1727203830.33258: variable 'ansible_distribution_major_version' from source: facts 13271 1727203830.33280: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203830.33489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13271 1727203830.35514: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13271 1727203830.35565: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13271 1727203830.35591: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13271 1727203830.35617: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13271 1727203830.35635: Loading FilterModule 'urlsplit' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13271 1727203830.35698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.35738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.35765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.35797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.35808: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.35897: variable 'ansible_distribution' from source: facts 13271 1727203830.35900: variable 'ansible_distribution_major_version' from source: facts 13271 1727203830.35913: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 13271 1727203830.35993: variable '__network_wireless_connections_defined' from source: role '' defaults 13271 1727203830.36083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.36098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.36114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.36141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.36151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.36182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.36201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.36217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.36242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.36252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.36284: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.36302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.36319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.36344: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.36354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.36458: variable 'network_connections' from source: task vars 13271 1727203830.36470: variable 'controller_profile' from source: play vars 13271 1727203830.36518: variable 'controller_profile' from source: play vars 13271 1727203830.36524: variable 'controller_device' from source: play vars 13271 1727203830.36567: variable 'controller_device' from source: play vars 13271 1727203830.36578: variable 'port1_profile' from source: play vars 13271 1727203830.36619: variable 'port1_profile' from source: play vars 13271 1727203830.36622: variable 'dhcp_interface1' from source: play vars 13271 1727203830.36667: variable 'dhcp_interface1' from source: play vars 13271 1727203830.36674: variable 'controller_profile' from source: play vars 13271 1727203830.36715: variable 'controller_profile' from source: play vars 13271 1727203830.36721: variable 'port2_profile' from source: play vars 13271 
1727203830.36763: variable 'port2_profile' from source: play vars 13271 1727203830.36776: variable 'dhcp_interface2' from source: play vars 13271 1727203830.36814: variable 'dhcp_interface2' from source: play vars 13271 1727203830.36819: variable 'controller_profile' from source: play vars 13271 1727203830.36861: variable 'controller_profile' from source: play vars 13271 1727203830.36925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13271 1727203830.37037: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13271 1727203830.37069: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13271 1727203830.37094: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13271 1727203830.37116: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13271 1727203830.37147: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13271 1727203830.37162: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13271 1727203830.37186: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.37207: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13271 1727203830.37301: variable '__network_team_connections_defined' from source: role '' defaults 13271 1727203830.37516: variable 
'network_connections' from source: task vars 13271 1727203830.37519: variable 'controller_profile' from source: play vars 13271 1727203830.37550: variable 'controller_profile' from source: play vars 13271 1727203830.37780: variable 'controller_device' from source: play vars 13271 1727203830.37783: variable 'controller_device' from source: play vars 13271 1727203830.37785: variable 'port1_profile' from source: play vars 13271 1727203830.37787: variable 'port1_profile' from source: play vars 13271 1727203830.37788: variable 'dhcp_interface1' from source: play vars 13271 1727203830.37790: variable 'dhcp_interface1' from source: play vars 13271 1727203830.37791: variable 'controller_profile' from source: play vars 13271 1727203830.37832: variable 'controller_profile' from source: play vars 13271 1727203830.37844: variable 'port2_profile' from source: play vars 13271 1727203830.37908: variable 'port2_profile' from source: play vars 13271 1727203830.37919: variable 'dhcp_interface2' from source: play vars 13271 1727203830.37981: variable 'dhcp_interface2' from source: play vars 13271 1727203830.37992: variable 'controller_profile' from source: play vars 13271 1727203830.38051: variable 'controller_profile' from source: play vars 13271 1727203830.38093: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13271 1727203830.38101: when evaluation is False, skipping this task 13271 1727203830.38109: _execute() done 13271 1727203830.38117: dumping result to json 13271 1727203830.38125: done dumping result, returning 13271 1727203830.38137: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-2a40-12ba-00000000002b] 13271 1727203830.38147: sending task result for task 028d2410-947f-2a40-12ba-00000000002b 13271 1727203830.38256: done sending task result for 
task 028d2410-947f-2a40-12ba-00000000002b 13271 1727203830.38268: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13271 1727203830.38344: no more pending results, returning what we have 13271 1727203830.38348: results queue empty 13271 1727203830.38348: checking for any_errors_fatal 13271 1727203830.38354: done checking for any_errors_fatal 13271 1727203830.38355: checking for max_fail_percentage 13271 1727203830.38357: done checking for max_fail_percentage 13271 1727203830.38357: checking to see if all hosts have failed and the running result is not ok 13271 1727203830.38358: done checking to see if all hosts have failed 13271 1727203830.38359: getting the remaining hosts for this loop 13271 1727203830.38360: done getting the remaining hosts for this loop 13271 1727203830.38364: getting the next task for host managed-node1 13271 1727203830.38370: done getting next task for host managed-node1 13271 1727203830.38374: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13271 1727203830.38389: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203830.38405: getting variables 13271 1727203830.38407: in VariableManager get_vars() 13271 1727203830.38446: Calling all_inventory to load vars for managed-node1 13271 1727203830.38449: Calling groups_inventory to load vars for managed-node1 13271 1727203830.38451: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203830.38461: Calling all_plugins_play to load vars for managed-node1 13271 1727203830.38464: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203830.38466: Calling groups_plugins_play to load vars for managed-node1 13271 1727203830.39323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203830.40185: done with get_vars() 13271 1727203830.40203: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13271 1727203830.40259: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:50:30 -0400 (0:00:00.084) 0:00:14.046 ***** 13271 1727203830.40284: entering _queue_task() for managed-node1/yum 13271 1727203830.40286: Creating lock for yum 13271 1727203830.40583: worker is 1 (out of 1 available) 13271 1727203830.40595: exiting _queue_task() for managed-node1/yum 13271 1727203830.40607: done queuing things up, now waiting for results queue to drain 13271 1727203830.40609: waiting for pending results... 
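[Editor's note: the `skipping: [managed-node1]` results above are produced by the role's `when:` guards — each `Evaluated conditional (...): False` line is followed by `when evaluation is False, skipping this task`, and the reported `false_condition` is the guard expression itself. A minimal hypothetical sketch of a task gated this way (the variable names are taken from the log; the actual task body in fedora.linux_system_roles.network may differ):

```yaml
# Hypothetical minimal reproduction of the skip behavior logged above,
# not the role's actual task file. On this host both role defaults
# evaluate to false, so the task is skipped with "false_condition" set
# to the when: expression.
- name: Task gated on wireless or team interfaces
  ansible.builtin.debug:
    msg: "would check for network package updates here"
  when: __network_wireless_connections_defined or __network_team_connections_defined
```
]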
13271 1727203830.40874: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13271 1727203830.40988: in run() - task 028d2410-947f-2a40-12ba-00000000002c 13271 1727203830.41004: variable 'ansible_search_path' from source: unknown 13271 1727203830.41011: variable 'ansible_search_path' from source: unknown 13271 1727203830.41046: calling self._execute() 13271 1727203830.41130: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203830.41142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203830.41156: variable 'omit' from source: magic vars 13271 1727203830.41496: variable 'ansible_distribution_major_version' from source: facts 13271 1727203830.41580: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203830.41671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13271 1727203830.44039: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13271 1727203830.44109: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13271 1727203830.44147: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13271 1727203830.44187: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13271 1727203830.44216: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13271 1727203830.44298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.44331: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.44362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.44418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.44436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.44786: variable 'ansible_distribution_major_version' from source: facts 13271 1727203830.44789: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13271 1727203830.44792: when evaluation is False, skipping this task 13271 1727203830.44794: _execute() done 13271 1727203830.44796: dumping result to json 13271 1727203830.44798: done dumping result, returning 13271 1727203830.44801: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-2a40-12ba-00000000002c] 13271 1727203830.44803: sending task result for task 028d2410-947f-2a40-12ba-00000000002c 13271 1727203830.44870: done sending task result for task 028d2410-947f-2a40-12ba-00000000002c 13271 1727203830.44874: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13271 1727203830.44919: no more pending results, returning 
what we have 13271 1727203830.44922: results queue empty 13271 1727203830.44923: checking for any_errors_fatal 13271 1727203830.44927: done checking for any_errors_fatal 13271 1727203830.44927: checking for max_fail_percentage 13271 1727203830.44929: done checking for max_fail_percentage 13271 1727203830.44930: checking to see if all hosts have failed and the running result is not ok 13271 1727203830.44931: done checking to see if all hosts have failed 13271 1727203830.44931: getting the remaining hosts for this loop 13271 1727203830.44932: done getting the remaining hosts for this loop 13271 1727203830.44935: getting the next task for host managed-node1 13271 1727203830.44941: done getting next task for host managed-node1 13271 1727203830.44945: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13271 1727203830.44948: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203830.44961: getting variables 13271 1727203830.44965: in VariableManager get_vars() 13271 1727203830.45005: Calling all_inventory to load vars for managed-node1 13271 1727203830.45008: Calling groups_inventory to load vars for managed-node1 13271 1727203830.45009: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203830.45018: Calling all_plugins_play to load vars for managed-node1 13271 1727203830.45025: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203830.45027: Calling groups_plugins_play to load vars for managed-node1 13271 1727203830.46699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203830.48098: done with get_vars() 13271 1727203830.48116: done getting variables 13271 1727203830.48160: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:50:30 -0400 (0:00:00.079) 0:00:14.125 ***** 13271 1727203830.48188: entering _queue_task() for managed-node1/fail 13271 1727203830.48426: worker is 1 (out of 1 available) 13271 1727203830.48440: exiting _queue_task() for managed-node1/fail 13271 1727203830.48453: done queuing things up, now waiting for results queue to drain 13271 1727203830.48455: waiting for pending results... 
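[Editor's note: the preceding task shows two mechanisms at once: the action redirect `ansible.builtin.yum` → `ansible.builtin.dnf` (logged as "redirecting (type: action)"), and the distribution-version guard that skips the YUM path entirely on this EL8+/Fedora-family host. A hypothetical sketch of such a guarded task (module arguments are assumptions; only the `when:` expression is taken verbatim from the log):

```yaml
# Hypothetical sketch of the version-guarded YUM check logged above.
# On hosts with ansible_distribution_major_version >= 8 the condition is
# False and the task is skipped; were it to run, the yum action would be
# redirected to dnf anyway, as the log's "redirecting" line shows.
- name: Check updates via the YUM package manager (older distributions only)
  ansible.builtin.yum:
    name: "{{ network_packages }}"
    state: latest
  check_mode: true
  when: ansible_distribution_major_version | int < 8
```
]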
13271 1727203830.48624: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13271 1727203830.48706: in run() - task 028d2410-947f-2a40-12ba-00000000002d 13271 1727203830.48717: variable 'ansible_search_path' from source: unknown 13271 1727203830.48721: variable 'ansible_search_path' from source: unknown 13271 1727203830.48749: calling self._execute() 13271 1727203830.48819: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203830.48824: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203830.48834: variable 'omit' from source: magic vars 13271 1727203830.49095: variable 'ansible_distribution_major_version' from source: facts 13271 1727203830.49104: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203830.49186: variable '__network_wireless_connections_defined' from source: role '' defaults 13271 1727203830.49315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13271 1727203830.51251: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13271 1727203830.51296: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13271 1727203830.51323: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13271 1727203830.51351: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13271 1727203830.51370: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13271 1727203830.51432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13271 1727203830.51460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.51477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.51503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.51513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.51546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.51569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.51585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.51610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.51620: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.51648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.51672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.51687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.51711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.51721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.51835: variable 'network_connections' from source: task vars 13271 1727203830.51845: variable 'controller_profile' from source: play vars 13271 1727203830.51899: variable 'controller_profile' from source: play vars 13271 1727203830.51908: variable 'controller_device' from source: play vars 13271 1727203830.51951: variable 'controller_device' from source: play vars 13271 1727203830.51960: variable 'port1_profile' from source: play vars 13271 1727203830.52005: variable 'port1_profile' from source: play vars 13271 1727203830.52012: variable 'dhcp_interface1' from source: play vars 13271 1727203830.52054: variable 'dhcp_interface1' from source: play vars 13271 1727203830.52059: variable 'controller_profile' from source: play vars 13271 
1727203830.52104: variable 'controller_profile' from source: play vars 13271 1727203830.52108: variable 'port2_profile' from source: play vars 13271 1727203830.52151: variable 'port2_profile' from source: play vars 13271 1727203830.52157: variable 'dhcp_interface2' from source: play vars 13271 1727203830.52202: variable 'dhcp_interface2' from source: play vars 13271 1727203830.52208: variable 'controller_profile' from source: play vars 13271 1727203830.52251: variable 'controller_profile' from source: play vars 13271 1727203830.52302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13271 1727203830.52429: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13271 1727203830.52459: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13271 1727203830.52483: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13271 1727203830.52504: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13271 1727203830.52536: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13271 1727203830.52554: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13271 1727203830.52573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.52593: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 13271 1727203830.52653: variable '__network_team_connections_defined' from source: role '' defaults 13271 1727203830.53081: variable 'network_connections' from source: task vars 13271 1727203830.53085: variable 'controller_profile' from source: play vars 13271 1727203830.53087: variable 'controller_profile' from source: play vars 13271 1727203830.53089: variable 'controller_device' from source: play vars 13271 1727203830.53091: variable 'controller_device' from source: play vars 13271 1727203830.53093: variable 'port1_profile' from source: play vars 13271 1727203830.53094: variable 'port1_profile' from source: play vars 13271 1727203830.53096: variable 'dhcp_interface1' from source: play vars 13271 1727203830.53145: variable 'dhcp_interface1' from source: play vars 13271 1727203830.53157: variable 'controller_profile' from source: play vars 13271 1727203830.53218: variable 'controller_profile' from source: play vars 13271 1727203830.53231: variable 'port2_profile' from source: play vars 13271 1727203830.53299: variable 'port2_profile' from source: play vars 13271 1727203830.53310: variable 'dhcp_interface2' from source: play vars 13271 1727203830.53369: variable 'dhcp_interface2' from source: play vars 13271 1727203830.53406: variable 'controller_profile' from source: play vars 13271 1727203830.53472: variable 'controller_profile' from source: play vars 13271 1727203830.53529: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13271 1727203830.53537: when evaluation is False, skipping this task 13271 1727203830.53543: _execute() done 13271 1727203830.53549: dumping result to json 13271 1727203830.53556: done dumping result, returning 13271 1727203830.53570: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-2a40-12ba-00000000002d] 13271 1727203830.53581: sending 
task result for task 028d2410-947f-2a40-12ba-00000000002d 13271 1727203830.53694: done sending task result for task 028d2410-947f-2a40-12ba-00000000002d 13271 1727203830.53704: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13271 1727203830.53757: no more pending results, returning what we have 13271 1727203830.53760: results queue empty 13271 1727203830.53760: checking for any_errors_fatal 13271 1727203830.53767: done checking for any_errors_fatal 13271 1727203830.53768: checking for max_fail_percentage 13271 1727203830.53770: done checking for max_fail_percentage 13271 1727203830.53771: checking to see if all hosts have failed and the running result is not ok 13271 1727203830.53772: done checking to see if all hosts have failed 13271 1727203830.53773: getting the remaining hosts for this loop 13271 1727203830.53774: done getting the remaining hosts for this loop 13271 1727203830.53947: getting the next task for host managed-node1 13271 1727203830.53955: done getting next task for host managed-node1 13271 1727203830.53959: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13271 1727203830.53965: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203830.53997: getting variables 13271 1727203830.53999: in VariableManager get_vars() 13271 1727203830.54037: Calling all_inventory to load vars for managed-node1 13271 1727203830.54039: Calling groups_inventory to load vars for managed-node1 13271 1727203830.54041: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203830.54054: Calling all_plugins_play to load vars for managed-node1 13271 1727203830.54057: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203830.54059: Calling groups_plugins_play to load vars for managed-node1 13271 1727203830.54886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203830.55880: done with get_vars() 13271 1727203830.55900: done getting variables 13271 1727203830.55954: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:50:30 -0400 (0:00:00.077) 0:00:14.203 ***** 13271 1727203830.55985: entering _queue_task() for managed-node1/package 13271 1727203830.56283: worker is 1 (out of 1 available) 13271 1727203830.56295: exiting _queue_task() for managed-node1/package 13271 1727203830.56310: done queuing things up, now waiting for results queue to drain 13271 1727203830.56311: waiting for pending results... 
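[Editor's note: the "Install packages" task queued above uses the generic `package` action plugin, which resolves to the platform package manager (dnf on this host). The log's subsequent variable resolution (`network_packages`, `__network_packages_default_nm`, `__network_packages_default_wpa_supplicant`, `__network_wpa_supplicant_required`) shows the role assembling the package list from its defaults. A hypothetical sketch of what such a task looks like (arguments are assumptions, not the role's verbatim source):

```yaml
# Hypothetical sketch of the "Install packages" task. The role computes
# network_packages from its defaults (NetworkManager for the nm provider,
# plus wpa_supplicant only when wireless/802.1x profiles require it, per
# the __network_wpa_supplicant_required resolution in the log).
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
```
]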
13271 1727203830.56681: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 13271 1727203830.56694: in run() - task 028d2410-947f-2a40-12ba-00000000002e 13271 1727203830.56714: variable 'ansible_search_path' from source: unknown 13271 1727203830.56718: variable 'ansible_search_path' from source: unknown 13271 1727203830.56765: calling self._execute() 13271 1727203830.56841: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203830.56845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203830.56847: variable 'omit' from source: magic vars 13271 1727203830.57167: variable 'ansible_distribution_major_version' from source: facts 13271 1727203830.57230: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203830.57359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13271 1727203830.57781: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13271 1727203830.57784: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13271 1727203830.57787: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13271 1727203830.57789: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13271 1727203830.57833: variable 'network_packages' from source: role '' defaults 13271 1727203830.57940: variable '__network_provider_setup' from source: role '' defaults 13271 1727203830.57955: variable '__network_service_name_default_nm' from source: role '' defaults 13271 1727203830.58020: variable '__network_service_name_default_nm' from source: role '' defaults 13271 1727203830.58038: variable '__network_packages_default_nm' from source: role '' defaults 13271 1727203830.58102: variable 
'__network_packages_default_nm' from source: role '' defaults 13271 1727203830.58220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13271 1727203830.59527: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13271 1727203830.59577: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13271 1727203830.59604: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13271 1727203830.59628: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13271 1727203830.59646: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13271 1727203830.59714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.59734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.59751: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.59785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.59797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 
1727203830.59828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.59843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.59878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.60080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.60083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.60169: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13271 1727203830.60306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.60333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.60362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.60415: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.60434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.60528: variable 'ansible_python' from source: facts 13271 1727203830.60556: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13271 1727203830.60674: variable '__network_wpa_supplicant_required' from source: role '' defaults 13271 1727203830.60796: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13271 1727203830.60904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.60920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.60936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.60965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.60979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.61012: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.61031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.61047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.61080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.61091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.61193: variable 'network_connections' from source: task vars 13271 1727203830.61198: variable 'controller_profile' from source: play vars 13271 1727203830.61265: variable 'controller_profile' from source: play vars 13271 1727203830.61280: variable 'controller_device' from source: play vars 13271 1727203830.61344: variable 'controller_device' from source: play vars 13271 1727203830.61353: variable 'port1_profile' from source: play vars 13271 1727203830.61428: variable 'port1_profile' from source: play vars 13271 1727203830.61436: variable 'dhcp_interface1' from source: play vars 13271 1727203830.61509: variable 'dhcp_interface1' from source: play vars 13271 1727203830.61517: variable 'controller_profile' from source: play vars 13271 1727203830.61585: variable 'controller_profile' from source: play vars 13271 1727203830.61593: variable 'port2_profile' from source: play vars 13271 
1727203830.61662: variable 'port2_profile' from source: play vars 13271 1727203830.61672: variable 'dhcp_interface2' from source: play vars 13271 1727203830.61741: variable 'dhcp_interface2' from source: play vars 13271 1727203830.61748: variable 'controller_profile' from source: play vars 13271 1727203830.61819: variable 'controller_profile' from source: play vars 13271 1727203830.61872: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13271 1727203830.61894: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13271 1727203830.61914: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.61934: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13271 1727203830.61979: variable '__network_wireless_connections_defined' from source: role '' defaults 13271 1727203830.62153: variable 'network_connections' from source: task vars 13271 1727203830.62165: variable 'controller_profile' from source: play vars 13271 1727203830.62234: variable 'controller_profile' from source: play vars 13271 1727203830.62241: variable 'controller_device' from source: play vars 13271 1727203830.62324: variable 'controller_device' from source: play vars 13271 1727203830.62333: variable 'port1_profile' from source: play vars 13271 1727203830.62405: variable 'port1_profile' from source: play vars 13271 1727203830.62412: variable 'dhcp_interface1' from source: play vars 13271 1727203830.62481: variable 'dhcp_interface1' from source: 
play vars 13271 1727203830.62486: variable 'controller_profile' from source: play vars 13271 1727203830.62552: variable 'controller_profile' from source: play vars 13271 1727203830.62560: variable 'port2_profile' from source: play vars 13271 1727203830.62629: variable 'port2_profile' from source: play vars 13271 1727203830.62637: variable 'dhcp_interface2' from source: play vars 13271 1727203830.62706: variable 'dhcp_interface2' from source: play vars 13271 1727203830.62718: variable 'controller_profile' from source: play vars 13271 1727203830.62780: variable 'controller_profile' from source: play vars 13271 1727203830.62819: variable '__network_packages_default_wireless' from source: role '' defaults 13271 1727203830.62873: variable '__network_wireless_connections_defined' from source: role '' defaults 13271 1727203830.63078: variable 'network_connections' from source: task vars 13271 1727203830.63081: variable 'controller_profile' from source: play vars 13271 1727203830.63125: variable 'controller_profile' from source: play vars 13271 1727203830.63131: variable 'controller_device' from source: play vars 13271 1727203830.63180: variable 'controller_device' from source: play vars 13271 1727203830.63187: variable 'port1_profile' from source: play vars 13271 1727203830.63231: variable 'port1_profile' from source: play vars 13271 1727203830.63236: variable 'dhcp_interface1' from source: play vars 13271 1727203830.63284: variable 'dhcp_interface1' from source: play vars 13271 1727203830.63290: variable 'controller_profile' from source: play vars 13271 1727203830.63333: variable 'controller_profile' from source: play vars 13271 1727203830.63339: variable 'port2_profile' from source: play vars 13271 1727203830.63387: variable 'port2_profile' from source: play vars 13271 1727203830.63393: variable 'dhcp_interface2' from source: play vars 13271 1727203830.63437: variable 'dhcp_interface2' from source: play vars 13271 1727203830.63442: variable 'controller_profile' from 
source: play vars 13271 1727203830.63490: variable 'controller_profile' from source: play vars 13271 1727203830.63508: variable '__network_packages_default_team' from source: role '' defaults 13271 1727203830.63559: variable '__network_team_connections_defined' from source: role '' defaults 13271 1727203830.63751: variable 'network_connections' from source: task vars 13271 1727203830.63755: variable 'controller_profile' from source: play vars 13271 1727203830.63803: variable 'controller_profile' from source: play vars 13271 1727203830.63806: variable 'controller_device' from source: play vars 13271 1727203830.63852: variable 'controller_device' from source: play vars 13271 1727203830.63860: variable 'port1_profile' from source: play vars 13271 1727203830.63906: variable 'port1_profile' from source: play vars 13271 1727203830.63911: variable 'dhcp_interface1' from source: play vars 13271 1727203830.63958: variable 'dhcp_interface1' from source: play vars 13271 1727203830.63964: variable 'controller_profile' from source: play vars 13271 1727203830.64008: variable 'controller_profile' from source: play vars 13271 1727203830.64014: variable 'port2_profile' from source: play vars 13271 1727203830.64060: variable 'port2_profile' from source: play vars 13271 1727203830.64066: variable 'dhcp_interface2' from source: play vars 13271 1727203830.64111: variable 'dhcp_interface2' from source: play vars 13271 1727203830.64116: variable 'controller_profile' from source: play vars 13271 1727203830.64161: variable 'controller_profile' from source: play vars 13271 1727203830.64208: variable '__network_service_name_default_initscripts' from source: role '' defaults 13271 1727203830.64247: variable '__network_service_name_default_initscripts' from source: role '' defaults 13271 1727203830.64261: variable '__network_packages_default_initscripts' from source: role '' defaults 13271 1727203830.64306: variable '__network_packages_default_initscripts' from source: role '' defaults 13271 
1727203830.64444: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13271 1727203830.64743: variable 'network_connections' from source: task vars 13271 1727203830.64746: variable 'controller_profile' from source: play vars 13271 1727203830.64794: variable 'controller_profile' from source: play vars 13271 1727203830.64804: variable 'controller_device' from source: play vars 13271 1727203830.64839: variable 'controller_device' from source: play vars 13271 1727203830.64846: variable 'port1_profile' from source: play vars 13271 1727203830.64890: variable 'port1_profile' from source: play vars 13271 1727203830.64901: variable 'dhcp_interface1' from source: play vars 13271 1727203830.64938: variable 'dhcp_interface1' from source: play vars 13271 1727203830.64944: variable 'controller_profile' from source: play vars 13271 1727203830.64987: variable 'controller_profile' from source: play vars 13271 1727203830.64993: variable 'port2_profile' from source: play vars 13271 1727203830.65035: variable 'port2_profile' from source: play vars 13271 1727203830.65041: variable 'dhcp_interface2' from source: play vars 13271 1727203830.65085: variable 'dhcp_interface2' from source: play vars 13271 1727203830.65090: variable 'controller_profile' from source: play vars 13271 1727203830.65135: variable 'controller_profile' from source: play vars 13271 1727203830.65138: variable 'ansible_distribution' from source: facts 13271 1727203830.65143: variable '__network_rh_distros' from source: role '' defaults 13271 1727203830.65148: variable 'ansible_distribution_major_version' from source: facts 13271 1727203830.65171: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13271 1727203830.65288: variable 'ansible_distribution' from source: facts 13271 1727203830.65291: variable '__network_rh_distros' from source: role '' defaults 13271 1727203830.65296: variable 'ansible_distribution_major_version' from source: 
facts 13271 1727203830.65308: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13271 1727203830.65416: variable 'ansible_distribution' from source: facts 13271 1727203830.65420: variable '__network_rh_distros' from source: role '' defaults 13271 1727203830.65424: variable 'ansible_distribution_major_version' from source: facts 13271 1727203830.65457: variable 'network_provider' from source: set_fact 13271 1727203830.65471: variable 'ansible_facts' from source: unknown 13271 1727203830.65901: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 13271 1727203830.65904: when evaluation is False, skipping this task 13271 1727203830.65907: _execute() done 13271 1727203830.65909: dumping result to json 13271 1727203830.65911: done dumping result, returning 13271 1727203830.65919: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-2a40-12ba-00000000002e] 13271 1727203830.65924: sending task result for task 028d2410-947f-2a40-12ba-00000000002e 13271 1727203830.66009: done sending task result for task 028d2410-947f-2a40-12ba-00000000002e 13271 1727203830.66012: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 13271 1727203830.66063: no more pending results, returning what we have 13271 1727203830.66066: results queue empty 13271 1727203830.66067: checking for any_errors_fatal 13271 1727203830.66072: done checking for any_errors_fatal 13271 1727203830.66073: checking for max_fail_percentage 13271 1727203830.66077: done checking for max_fail_percentage 13271 1727203830.66078: checking to see if all hosts have failed and the running result is not ok 13271 1727203830.66079: done checking to see if all hosts have failed 13271 1727203830.66079: getting the remaining hosts for 
this loop 13271 1727203830.66080: done getting the remaining hosts for this loop 13271 1727203830.66084: getting the next task for host managed-node1 13271 1727203830.66089: done getting next task for host managed-node1 13271 1727203830.66092: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13271 1727203830.66095: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203830.66109: getting variables 13271 1727203830.66111: in VariableManager get_vars() 13271 1727203830.66156: Calling all_inventory to load vars for managed-node1 13271 1727203830.66159: Calling groups_inventory to load vars for managed-node1 13271 1727203830.66161: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203830.66172: Calling all_plugins_play to load vars for managed-node1 13271 1727203830.66174: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203830.66179: Calling groups_plugins_play to load vars for managed-node1 13271 1727203830.66992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203830.67871: done with get_vars() 13271 1727203830.67896: done getting variables 13271 1727203830.67941: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:50:30 -0400 (0:00:00.119) 0:00:14.322 ***** 13271 1727203830.67970: entering _queue_task() for managed-node1/package 13271 1727203830.68226: worker is 1 (out of 1 available) 13271 1727203830.68239: exiting _queue_task() for managed-node1/package 13271 1727203830.68252: done queuing things up, now waiting for results queue to drain 13271 1727203830.68254: waiting for pending results... 13271 1727203830.68424: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 13271 1727203830.68507: in run() - task 028d2410-947f-2a40-12ba-00000000002f 13271 1727203830.68518: variable 'ansible_search_path' from source: unknown 13271 1727203830.68521: variable 'ansible_search_path' from source: unknown 13271 1727203830.68552: calling self._execute() 13271 1727203830.68621: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203830.68625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203830.68633: variable 'omit' from source: magic vars 13271 1727203830.68899: variable 'ansible_distribution_major_version' from source: facts 13271 1727203830.68908: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203830.68993: variable 'network_state' from source: role '' defaults 13271 1727203830.69001: Evaluated conditional (network_state != {}): False 13271 1727203830.69004: when evaluation is False, skipping this task 13271 1727203830.69007: _execute() done 13271 
1727203830.69009: dumping result to json 13271 1727203830.69012: done dumping result, returning 13271 1727203830.69020: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-2a40-12ba-00000000002f] 13271 1727203830.69023: sending task result for task 028d2410-947f-2a40-12ba-00000000002f 13271 1727203830.69115: done sending task result for task 028d2410-947f-2a40-12ba-00000000002f 13271 1727203830.69118: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13271 1727203830.69186: no more pending results, returning what we have 13271 1727203830.69190: results queue empty 13271 1727203830.69190: checking for any_errors_fatal 13271 1727203830.69195: done checking for any_errors_fatal 13271 1727203830.69196: checking for max_fail_percentage 13271 1727203830.69197: done checking for max_fail_percentage 13271 1727203830.69198: checking to see if all hosts have failed and the running result is not ok 13271 1727203830.69199: done checking to see if all hosts have failed 13271 1727203830.69200: getting the remaining hosts for this loop 13271 1727203830.69201: done getting the remaining hosts for this loop 13271 1727203830.69204: getting the next task for host managed-node1 13271 1727203830.69211: done getting next task for host managed-node1 13271 1727203830.69215: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13271 1727203830.69218: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203830.69233: getting variables 13271 1727203830.69235: in VariableManager get_vars() 13271 1727203830.69274: Calling all_inventory to load vars for managed-node1 13271 1727203830.69278: Calling groups_inventory to load vars for managed-node1 13271 1727203830.69281: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203830.69290: Calling all_plugins_play to load vars for managed-node1 13271 1727203830.69292: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203830.69294: Calling groups_plugins_play to load vars for managed-node1 13271 1727203830.70188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203830.74251: done with get_vars() 13271 1727203830.74279: done getting variables 13271 1727203830.74317: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:50:30 -0400 (0:00:00.063) 0:00:14.386 ***** 13271 1727203830.74340: entering _queue_task() for managed-node1/package 13271 1727203830.74604: worker is 1 (out of 1 available) 13271 1727203830.74618: exiting _queue_task() 
for managed-node1/package 13271 1727203830.74631: done queuing things up, now waiting for results queue to drain 13271 1727203830.74633: waiting for pending results... 13271 1727203830.74802: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 13271 1727203830.74891: in run() - task 028d2410-947f-2a40-12ba-000000000030 13271 1727203830.74902: variable 'ansible_search_path' from source: unknown 13271 1727203830.74906: variable 'ansible_search_path' from source: unknown 13271 1727203830.74934: calling self._execute() 13271 1727203830.75001: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203830.75005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203830.75013: variable 'omit' from source: magic vars 13271 1727203830.75290: variable 'ansible_distribution_major_version' from source: facts 13271 1727203830.75303: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203830.75384: variable 'network_state' from source: role '' defaults 13271 1727203830.75392: Evaluated conditional (network_state != {}): False 13271 1727203830.75395: when evaluation is False, skipping this task 13271 1727203830.75398: _execute() done 13271 1727203830.75402: dumping result to json 13271 1727203830.75405: done dumping result, returning 13271 1727203830.75416: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-2a40-12ba-000000000030] 13271 1727203830.75419: sending task result for task 028d2410-947f-2a40-12ba-000000000030 13271 1727203830.75509: done sending task result for task 028d2410-947f-2a40-12ba-000000000030 13271 1727203830.75514: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" 
} 13271 1727203830.75561: no more pending results, returning what we have 13271 1727203830.75567: results queue empty 13271 1727203830.75567: checking for any_errors_fatal 13271 1727203830.75577: done checking for any_errors_fatal 13271 1727203830.75578: checking for max_fail_percentage 13271 1727203830.75579: done checking for max_fail_percentage 13271 1727203830.75580: checking to see if all hosts have failed and the running result is not ok 13271 1727203830.75581: done checking to see if all hosts have failed 13271 1727203830.75582: getting the remaining hosts for this loop 13271 1727203830.75583: done getting the remaining hosts for this loop 13271 1727203830.75587: getting the next task for host managed-node1 13271 1727203830.75593: done getting next task for host managed-node1 13271 1727203830.75596: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13271 1727203830.75599: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203830.75614: getting variables 13271 1727203830.75616: in VariableManager get_vars() 13271 1727203830.75657: Calling all_inventory to load vars for managed-node1 13271 1727203830.75659: Calling groups_inventory to load vars for managed-node1 13271 1727203830.75663: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203830.75672: Calling all_plugins_play to load vars for managed-node1 13271 1727203830.75674: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203830.75684: Calling groups_plugins_play to load vars for managed-node1 13271 1727203830.76458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203830.77356: done with get_vars() 13271 1727203830.77378: done getting variables 13271 1727203830.77456: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:50:30 -0400 (0:00:00.031) 0:00:14.418 ***** 13271 1727203830.77485: entering _queue_task() for managed-node1/service 13271 1727203830.77487: Creating lock for service 13271 1727203830.77749: worker is 1 (out of 1 available) 13271 1727203830.77771: exiting _queue_task() for managed-node1/service 13271 1727203830.77791: done queuing things up, now waiting for results queue to drain 13271 1727203830.77793: waiting for pending results... 
13271 1727203830.77968: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 13271 1727203830.78107: in run() - task 028d2410-947f-2a40-12ba-000000000031 13271 1727203830.78112: variable 'ansible_search_path' from source: unknown 13271 1727203830.78114: variable 'ansible_search_path' from source: unknown 13271 1727203830.78183: calling self._execute() 13271 1727203830.78235: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203830.78247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203830.78262: variable 'omit' from source: magic vars 13271 1727203830.78652: variable 'ansible_distribution_major_version' from source: facts 13271 1727203830.78669: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203830.78790: variable '__network_wireless_connections_defined' from source: role '' defaults 13271 1727203830.79055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13271 1727203830.80970: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13271 1727203830.81482: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13271 1727203830.81486: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13271 1727203830.81489: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13271 1727203830.81491: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13271 1727203830.81531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 13271 1727203830.81566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.81600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.81643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.81658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.81707: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.81738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.81768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.81815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.81837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.81893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.81916: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.81938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.81972: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.81991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.82147: variable 'network_connections' from source: task vars 13271 1727203830.82162: variable 'controller_profile' from source: play vars 13271 1727203830.82237: variable 'controller_profile' from source: play vars 13271 1727203830.82250: variable 'controller_device' from source: play vars 13271 1727203830.82309: variable 'controller_device' from source: play vars 13271 1727203830.82323: variable 'port1_profile' from source: play vars 13271 1727203830.82383: variable 'port1_profile' from source: play vars 13271 1727203830.82406: variable 'dhcp_interface1' from source: play vars 13271 1727203830.82461: variable 'dhcp_interface1' from source: play vars 13271 1727203830.82481: variable 'controller_profile' from source: play vars 13271 
1727203830.82535: variable 'controller_profile' from source: play vars 13271 1727203830.82680: variable 'port2_profile' from source: play vars 13271 1727203830.82682: variable 'port2_profile' from source: play vars 13271 1727203830.82685: variable 'dhcp_interface2' from source: play vars 13271 1727203830.82687: variable 'dhcp_interface2' from source: play vars 13271 1727203830.82689: variable 'controller_profile' from source: play vars 13271 1727203830.82736: variable 'controller_profile' from source: play vars 13271 1727203830.82809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13271 1727203830.82975: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13271 1727203830.83015: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13271 1727203830.83060: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13271 1727203830.83091: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13271 1727203830.83133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13271 1727203830.83155: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13271 1727203830.83183: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.83214: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 13271 1727203830.83292: variable '__network_team_connections_defined' from source: role '' defaults 13271 1727203830.83525: variable 'network_connections' from source: task vars 13271 1727203830.83535: variable 'controller_profile' from source: play vars 13271 1727203830.83597: variable 'controller_profile' from source: play vars 13271 1727203830.83609: variable 'controller_device' from source: play vars 13271 1727203830.83669: variable 'controller_device' from source: play vars 13271 1727203830.83685: variable 'port1_profile' from source: play vars 13271 1727203830.83739: variable 'port1_profile' from source: play vars 13271 1727203830.83749: variable 'dhcp_interface1' from source: play vars 13271 1727203830.83806: variable 'dhcp_interface1' from source: play vars 13271 1727203830.83818: variable 'controller_profile' from source: play vars 13271 1727203830.83880: variable 'controller_profile' from source: play vars 13271 1727203830.84080: variable 'port2_profile' from source: play vars 13271 1727203830.84083: variable 'port2_profile' from source: play vars 13271 1727203830.84085: variable 'dhcp_interface2' from source: play vars 13271 1727203830.84086: variable 'dhcp_interface2' from source: play vars 13271 1727203830.84088: variable 'controller_profile' from source: play vars 13271 1727203830.84089: variable 'controller_profile' from source: play vars 13271 1727203830.84115: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13271 1727203830.84122: when evaluation is False, skipping this task 13271 1727203830.84127: _execute() done 13271 1727203830.84132: dumping result to json 13271 1727203830.84138: done dumping result, returning 13271 1727203830.84149: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-2a40-12ba-000000000031] 13271 1727203830.84158: sending task result for task 
028d2410-947f-2a40-12ba-000000000031 skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13271 1727203830.84305: no more pending results, returning what we have 13271 1727203830.84308: results queue empty 13271 1727203830.84308: checking for any_errors_fatal 13271 1727203830.84314: done checking for any_errors_fatal 13271 1727203830.84315: checking for max_fail_percentage 13271 1727203830.84316: done checking for max_fail_percentage 13271 1727203830.84317: checking to see if all hosts have failed and the running result is not ok 13271 1727203830.84318: done checking to see if all hosts have failed 13271 1727203830.84319: getting the remaining hosts for this loop 13271 1727203830.84320: done getting the remaining hosts for this loop 13271 1727203830.84323: getting the next task for host managed-node1 13271 1727203830.84329: done getting next task for host managed-node1 13271 1727203830.84333: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13271 1727203830.84335: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203830.84348: getting variables 13271 1727203830.84350: in VariableManager get_vars() 13271 1727203830.84393: Calling all_inventory to load vars for managed-node1 13271 1727203830.84396: Calling groups_inventory to load vars for managed-node1 13271 1727203830.84398: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203830.84408: Calling all_plugins_play to load vars for managed-node1 13271 1727203830.84410: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203830.84413: Calling groups_plugins_play to load vars for managed-node1 13271 1727203830.85384: done sending task result for task 028d2410-947f-2a40-12ba-000000000031 13271 1727203830.85388: WORKER PROCESS EXITING 13271 1727203830.87448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203830.90890: done with get_vars() 13271 1727203830.90924: done getting variables 13271 1727203830.91099: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:50:30 -0400 (0:00:00.136) 0:00:14.554 ***** 13271 1727203830.91134: entering _queue_task() for managed-node1/service 13271 1727203830.91769: worker is 1 (out of 1 available) 13271 1727203830.91784: exiting _queue_task() for managed-node1/service 13271 1727203830.91797: done queuing things up, now waiting for results queue to drain 13271 1727203830.91799: waiting for pending results... 
13271 1727203830.92003: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13271 1727203830.92159: in run() - task 028d2410-947f-2a40-12ba-000000000032 13271 1727203830.92182: variable 'ansible_search_path' from source: unknown 13271 1727203830.92193: variable 'ansible_search_path' from source: unknown 13271 1727203830.92242: calling self._execute() 13271 1727203830.92345: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203830.92359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203830.92378: variable 'omit' from source: magic vars 13271 1727203830.92791: variable 'ansible_distribution_major_version' from source: facts 13271 1727203830.92809: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203830.93046: variable 'network_provider' from source: set_fact 13271 1727203830.93221: variable 'network_state' from source: role '' defaults 13271 1727203830.93315: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13271 1727203830.93382: variable 'omit' from source: magic vars 13271 1727203830.93445: variable 'omit' from source: magic vars 13271 1727203830.93485: variable 'network_service_name' from source: role '' defaults 13271 1727203830.93572: variable 'network_service_name' from source: role '' defaults 13271 1727203830.93715: variable '__network_provider_setup' from source: role '' defaults 13271 1727203830.93727: variable '__network_service_name_default_nm' from source: role '' defaults 13271 1727203830.93801: variable '__network_service_name_default_nm' from source: role '' defaults 13271 1727203830.93814: variable '__network_packages_default_nm' from source: role '' defaults 13271 1727203830.93887: variable '__network_packages_default_nm' from source: role '' defaults 13271 1727203830.94118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 13271 1727203830.95926: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13271 1727203830.95983: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13271 1727203830.96011: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13271 1727203830.96040: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13271 1727203830.96059: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13271 1727203830.96123: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.96146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.96168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.96196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.96206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.96243: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13271 1727203830.96263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.96282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.96311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.96351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.96781: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13271 1727203830.96784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.96787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.96789: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.96791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.96793: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.97392: variable 'ansible_python' from source: facts 13271 1727203830.97395: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13271 1727203830.97398: variable '__network_wpa_supplicant_required' from source: role '' defaults 13271 1727203830.97467: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13271 1727203830.97706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.97738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.97794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.97857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.97881: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.97935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203830.97973: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203830.98007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.98049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203830.98067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203830.98209: variable 'network_connections' from source: task vars 13271 1727203830.98224: variable 'controller_profile' from source: play vars 13271 1727203830.98303: variable 'controller_profile' from source: play vars 13271 1727203830.98320: variable 'controller_device' from source: play vars 13271 1727203830.98394: variable 'controller_device' from source: play vars 13271 1727203830.98443: variable 'port1_profile' from source: play vars 13271 1727203830.98564: variable 'port1_profile' from source: play vars 13271 1727203830.98588: variable 'dhcp_interface1' from source: play vars 13271 1727203830.98640: variable 'dhcp_interface1' from source: play vars 13271 1727203830.98669: variable 'controller_profile' from source: play vars 13271 1727203830.98719: variable 'controller_profile' from source: play vars 13271 1727203830.98728: variable 'port2_profile' from source: play vars 13271 1727203830.98779: variable 'port2_profile' from source: play vars 13271 1727203830.98793: variable 'dhcp_interface2' from source: play vars 13271 1727203830.98839: variable 'dhcp_interface2' from source: play vars 13271 
1727203830.98847: variable 'controller_profile' from source: play vars 13271 1727203830.98902: variable 'controller_profile' from source: play vars 13271 1727203830.98972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13271 1727203830.99119: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13271 1727203830.99154: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13271 1727203830.99188: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13271 1727203830.99217: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13271 1727203830.99265: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13271 1727203830.99286: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13271 1727203830.99310: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203830.99337: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13271 1727203830.99371: variable '__network_wireless_connections_defined' from source: role '' defaults 13271 1727203830.99554: variable 'network_connections' from source: task vars 13271 1727203830.99557: variable 'controller_profile' from source: play vars 13271 1727203830.99634: variable 'controller_profile' from source: play vars 13271 
1727203830.99639: variable 'controller_device' from source: play vars 13271 1727203830.99880: variable 'controller_device' from source: play vars 13271 1727203830.99883: variable 'port1_profile' from source: play vars 13271 1727203830.99889: variable 'port1_profile' from source: play vars 13271 1727203830.99892: variable 'dhcp_interface1' from source: play vars 13271 1727203830.99894: variable 'dhcp_interface1' from source: play vars 13271 1727203830.99896: variable 'controller_profile' from source: play vars 13271 1727203830.99945: variable 'controller_profile' from source: play vars 13271 1727203830.99963: variable 'port2_profile' from source: play vars 13271 1727203831.00043: variable 'port2_profile' from source: play vars 13271 1727203831.00067: variable 'dhcp_interface2' from source: play vars 13271 1727203831.00160: variable 'dhcp_interface2' from source: play vars 13271 1727203831.00187: variable 'controller_profile' from source: play vars 13271 1727203831.00366: variable 'controller_profile' from source: play vars 13271 1727203831.00421: variable '__network_packages_default_wireless' from source: role '' defaults 13271 1727203831.00520: variable '__network_wireless_connections_defined' from source: role '' defaults 13271 1727203831.00849: variable 'network_connections' from source: task vars 13271 1727203831.00871: variable 'controller_profile' from source: play vars 13271 1727203831.00947: variable 'controller_profile' from source: play vars 13271 1727203831.00961: variable 'controller_device' from source: play vars 13271 1727203831.01052: variable 'controller_device' from source: play vars 13271 1727203831.01069: variable 'port1_profile' from source: play vars 13271 1727203831.01199: variable 'port1_profile' from source: play vars 13271 1727203831.01202: variable 'dhcp_interface1' from source: play vars 13271 1727203831.01250: variable 'dhcp_interface1' from source: play vars 13271 1727203831.01265: variable 'controller_profile' from source: play vars 
13271 1727203831.01361: variable 'controller_profile' from source: play vars 13271 1727203831.01378: variable 'port2_profile' from source: play vars 13271 1727203831.01523: variable 'port2_profile' from source: play vars 13271 1727203831.01526: variable 'dhcp_interface2' from source: play vars 13271 1727203831.01548: variable 'dhcp_interface2' from source: play vars 13271 1727203831.01561: variable 'controller_profile' from source: play vars 13271 1727203831.01643: variable 'controller_profile' from source: play vars 13271 1727203831.01679: variable '__network_packages_default_team' from source: role '' defaults 13271 1727203831.01783: variable '__network_team_connections_defined' from source: role '' defaults 13271 1727203831.02151: variable 'network_connections' from source: task vars 13271 1727203831.02165: variable 'controller_profile' from source: play vars 13271 1727203831.02312: variable 'controller_profile' from source: play vars 13271 1727203831.02324: variable 'controller_device' from source: play vars 13271 1727203831.02515: variable 'controller_device' from source: play vars 13271 1727203831.02519: variable 'port1_profile' from source: play vars 13271 1727203831.02521: variable 'port1_profile' from source: play vars 13271 1727203831.02523: variable 'dhcp_interface1' from source: play vars 13271 1727203831.02643: variable 'dhcp_interface1' from source: play vars 13271 1727203831.02646: variable 'controller_profile' from source: play vars 13271 1727203831.02692: variable 'controller_profile' from source: play vars 13271 1727203831.02697: variable 'port2_profile' from source: play vars 13271 1727203831.02750: variable 'port2_profile' from source: play vars 13271 1727203831.02765: variable 'dhcp_interface2' from source: play vars 13271 1727203831.02809: variable 'dhcp_interface2' from source: play vars 13271 1727203831.02814: variable 'controller_profile' from source: play vars 13271 1727203831.02870: variable 'controller_profile' from source: play vars 
13271 1727203831.02912: variable '__network_service_name_default_initscripts' from source: role '' defaults 13271 1727203831.02954: variable '__network_service_name_default_initscripts' from source: role '' defaults 13271 1727203831.02959: variable '__network_packages_default_initscripts' from source: role '' defaults 13271 1727203831.03005: variable '__network_packages_default_initscripts' from source: role '' defaults 13271 1727203831.03138: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13271 1727203831.03443: variable 'network_connections' from source: task vars 13271 1727203831.03447: variable 'controller_profile' from source: play vars 13271 1727203831.03490: variable 'controller_profile' from source: play vars 13271 1727203831.03497: variable 'controller_device' from source: play vars 13271 1727203831.03539: variable 'controller_device' from source: play vars 13271 1727203831.03545: variable 'port1_profile' from source: play vars 13271 1727203831.03587: variable 'port1_profile' from source: play vars 13271 1727203831.03594: variable 'dhcp_interface1' from source: play vars 13271 1727203831.03634: variable 'dhcp_interface1' from source: play vars 13271 1727203831.03639: variable 'controller_profile' from source: play vars 13271 1727203831.03680: variable 'controller_profile' from source: play vars 13271 1727203831.03686: variable 'port2_profile' from source: play vars 13271 1727203831.03727: variable 'port2_profile' from source: play vars 13271 1727203831.03733: variable 'dhcp_interface2' from source: play vars 13271 1727203831.03776: variable 'dhcp_interface2' from source: play vars 13271 1727203831.03781: variable 'controller_profile' from source: play vars 13271 1727203831.03821: variable 'controller_profile' from source: play vars 13271 1727203831.03828: variable 'ansible_distribution' from source: facts 13271 1727203831.03831: variable '__network_rh_distros' from source: role '' defaults 13271 1727203831.03837: 
variable 'ansible_distribution_major_version' from source: facts 13271 1727203831.03860: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13271 1727203831.04061: variable 'ansible_distribution' from source: facts 13271 1727203831.04064: variable '__network_rh_distros' from source: role '' defaults 13271 1727203831.04073: variable 'ansible_distribution_major_version' from source: facts 13271 1727203831.04108: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13271 1727203831.04265: variable 'ansible_distribution' from source: facts 13271 1727203831.04268: variable '__network_rh_distros' from source: role '' defaults 13271 1727203831.04270: variable 'ansible_distribution_major_version' from source: facts 13271 1727203831.04288: variable 'network_provider' from source: set_fact 13271 1727203831.04365: variable 'omit' from source: magic vars 13271 1727203831.04368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203831.04423: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203831.04426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203831.04428: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203831.04430: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203831.04453: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203831.04456: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203831.04458: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203831.04648: Set connection var ansible_connection to ssh 13271 
1727203831.04745: Set connection var ansible_shell_type to sh 13271 1727203831.04748: Set connection var ansible_timeout to 10 13271 1727203831.04750: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203831.04753: Set connection var ansible_pipelining to False 13271 1727203831.04755: Set connection var ansible_shell_executable to /bin/sh 13271 1727203831.04757: variable 'ansible_shell_executable' from source: unknown 13271 1727203831.04759: variable 'ansible_connection' from source: unknown 13271 1727203831.04760: variable 'ansible_module_compression' from source: unknown 13271 1727203831.04762: variable 'ansible_shell_type' from source: unknown 13271 1727203831.04764: variable 'ansible_shell_executable' from source: unknown 13271 1727203831.04766: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203831.04768: variable 'ansible_pipelining' from source: unknown 13271 1727203831.04770: variable 'ansible_timeout' from source: unknown 13271 1727203831.04772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203831.04872: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203831.04900: variable 'omit' from source: magic vars 13271 1727203831.04909: starting attempt loop 13271 1727203831.04914: running the handler 13271 1727203831.05181: variable 'ansible_facts' from source: unknown 13271 1727203831.06004: _low_level_execute_command(): starting 13271 1727203831.06019: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203831.06730: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13271 1727203831.06739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203831.06745: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203831.06793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203831.06831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203831.06844: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203831.06928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203831.08735: stdout chunk (state=3): >>>/root <<< 13271 1727203831.08906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203831.08910: stdout chunk (state=3): >>><<< 13271 1727203831.08912: stderr chunk (state=3): >>><<< 13271 1727203831.08932: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203831.09034: _low_level_execute_command(): starting 13271 1727203831.09039: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203831.0894523-14435-102734513835918 `" && echo ansible-tmp-1727203831.0894523-14435-102734513835918="` echo /root/.ansible/tmp/ansible-tmp-1727203831.0894523-14435-102734513835918 `" ) && sleep 0' 13271 1727203831.09572: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203831.09579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203831.09609: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203831.09615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203831.09662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203831.09666: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203831.09668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203831.09755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203831.11831: stdout chunk (state=3): >>>ansible-tmp-1727203831.0894523-14435-102734513835918=/root/.ansible/tmp/ansible-tmp-1727203831.0894523-14435-102734513835918 <<< 13271 1727203831.11933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203831.11966: stderr chunk (state=3): >>><<< 13271 1727203831.11970: stdout chunk (state=3): >>><<< 13271 1727203831.11989: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203831.0894523-14435-102734513835918=/root/.ansible/tmp/ansible-tmp-1727203831.0894523-14435-102734513835918 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203831.12016: variable 'ansible_module_compression' from source: unknown 13271 1727203831.12063: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 13271 1727203831.12069: ANSIBALLZ: Acquiring lock 13271 1727203831.12072: ANSIBALLZ: Lock acquired: 140497830695696 13271 1727203831.12074: ANSIBALLZ: Creating module 13271 1727203831.41551: ANSIBALLZ: Writing module into payload 13271 1727203831.41712: ANSIBALLZ: Writing module 13271 1727203831.41768: ANSIBALLZ: Renaming module 13271 1727203831.41778: ANSIBALLZ: Done creating module 13271 1727203831.41807: variable 'ansible_facts' from source: unknown 13271 1727203831.42038: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203831.0894523-14435-102734513835918/AnsiballZ_systemd.py 13271 1727203831.42234: Sending initial data 13271 1727203831.42238: Sent initial data (156 bytes) 13271 1727203831.42983: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203831.42989: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203831.43282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203831.43622: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203831.45357: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13271 1727203831.45408: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203831.45670: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203831.45739: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpjxs9th_0 /root/.ansible/tmp/ansible-tmp-1727203831.0894523-14435-102734513835918/AnsiballZ_systemd.py <<< 13271 1727203831.45742: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203831.0894523-14435-102734513835918/AnsiballZ_systemd.py" <<< 13271 1727203831.45935: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpjxs9th_0" to remote "/root/.ansible/tmp/ansible-tmp-1727203831.0894523-14435-102734513835918/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203831.0894523-14435-102734513835918/AnsiballZ_systemd.py" <<< 13271 1727203831.48342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203831.48378: stderr chunk (state=3): >>><<< 13271 1727203831.48394: stdout chunk (state=3): >>><<< 13271 1727203831.48450: done transferring module to remote 13271 1727203831.48465: _low_level_execute_command(): starting 13271 1727203831.48495: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203831.0894523-14435-102734513835918/ /root/.ansible/tmp/ansible-tmp-1727203831.0894523-14435-102734513835918/AnsiballZ_systemd.py && sleep 0' 13271 1727203831.49156: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203831.49217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203831.49233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203831.49497: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203831.49604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203831.51609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203831.51622: stdout chunk (state=3): >>><<< 13271 1727203831.51635: stderr chunk (state=3): >>><<< 13271 1727203831.51656: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203831.51665: _low_level_execute_command(): starting 13271 1727203831.51678: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203831.0894523-14435-102734513835918/AnsiballZ_systemd.py && sleep 0' 13271 1727203831.52284: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203831.52302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203831.52317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203831.52337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203831.52355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203831.52373: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203831.52402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203831.52423: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 1727203831.52436: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 13271 1727203831.52448: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13271 1727203831.52465: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203831.52484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203831.52502: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203831.52589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203831.52601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203831.52792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203831.83973: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", 
"ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9957376", "MemoryPeak": "10485760", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3282898944", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "318482000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", 
"DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRec<<< 13271 1727203831.84017: stdout chunk (state=3): >>>eive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", 
"NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13271 1727203831.86167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203831.86190: stdout chunk (state=3): >>><<< 13271 1727203831.86210: stderr chunk (state=3): >>><<< 13271 1727203831.86233: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "9957376", "MemoryPeak": "10485760", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3282898944", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "318482000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
13271 1727203831.86445: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203831.0894523-14435-102734513835918/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203831.86477: _low_level_execute_command(): starting 13271 1727203831.86488: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203831.0894523-14435-102734513835918/ > /dev/null 2>&1 && sleep 0' 13271 1727203831.87121: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203831.87140: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203831.87154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203831.87172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203831.87190: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203831.87201: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203831.87214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203831.87243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 13271 1727203831.87290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203831.87355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203831.87366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203831.87382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203831.87492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203831.89582: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203831.89586: stderr chunk (state=3): >>><<< 13271 1727203831.89588: stdout chunk (state=3): >>><<< 13271 1727203831.89591: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203831.89593: handler run complete 13271 1727203831.89634: attempt loop complete, returning result 13271 1727203831.89637: _execute() done 13271 1727203831.89640: dumping result to json 13271 1727203831.89658: done dumping result, returning 13271 1727203831.89669: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-2a40-12ba-000000000032] 13271 1727203831.89672: sending task result for task 028d2410-947f-2a40-12ba-000000000032 13271 1727203831.89946: done sending task result for task 028d2410-947f-2a40-12ba-000000000032 13271 1727203831.89950: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13271 1727203831.90006: no more pending results, returning what we have 13271 1727203831.90009: results queue empty 13271 1727203831.90010: checking for any_errors_fatal 13271 1727203831.90017: done checking for any_errors_fatal 13271 1727203831.90018: checking for max_fail_percentage 13271 1727203831.90019: done checking for max_fail_percentage 13271 1727203831.90020: checking to see if all hosts have failed and the running result is not ok 13271 1727203831.90021: done checking to see if all hosts have failed 13271 1727203831.90022: getting the remaining hosts for this loop 13271 1727203831.90023: done getting the remaining hosts for this loop 13271 1727203831.90027: getting the next task for host managed-node1 13271 1727203831.90034: done getting next task for host managed-node1 13271 
1727203831.90038: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13271 1727203831.90041: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203831.90051: getting variables 13271 1727203831.90053: in VariableManager get_vars() 13271 1727203831.90092: Calling all_inventory to load vars for managed-node1 13271 1727203831.90214: Calling groups_inventory to load vars for managed-node1 13271 1727203831.90218: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203831.90228: Calling all_plugins_play to load vars for managed-node1 13271 1727203831.90231: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203831.90234: Calling groups_plugins_play to load vars for managed-node1 13271 1727203831.91732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203831.93500: done with get_vars() 13271 1727203831.93522: done getting variables 13271 1727203831.93599: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:50:31 -0400 (0:00:01.024) 0:00:15.579 ***** 13271 1727203831.93637: entering _queue_task() for managed-node1/service 13271 1727203831.93985: worker is 1 (out of 1 available) 13271 1727203831.93998: exiting _queue_task() for managed-node1/service 13271 1727203831.94011: done queuing things up, now waiting for results queue to drain 13271 1727203831.94013: waiting for pending results... 13271 1727203831.94362: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 13271 1727203831.94459: in run() - task 028d2410-947f-2a40-12ba-000000000033 13271 1727203831.94462: variable 'ansible_search_path' from source: unknown 13271 1727203831.94465: variable 'ansible_search_path' from source: unknown 13271 1727203831.94467: calling self._execute() 13271 1727203831.94545: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203831.94556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203831.94573: variable 'omit' from source: magic vars 13271 1727203831.94911: variable 'ansible_distribution_major_version' from source: facts 13271 1727203831.94925: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203831.95038: variable 'network_provider' from source: set_fact 13271 1727203831.95048: Evaluated conditional (network_provider == "nm"): True 13271 1727203831.95137: variable '__network_wpa_supplicant_required' from source: role '' defaults 13271 1727203831.95382: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13271 1727203831.95406: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13271 1727203831.97411: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13271 
1727203831.97486: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13271 1727203831.97527: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13271 1727203831.97570: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13271 1727203831.97604: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13271 1727203831.97702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203831.97738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203831.97766: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203831.97817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203831.97834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203831.97887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203831.97919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203831.97949: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203831.98000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203831.98020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203831.98065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203831.98101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203831.98180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203831.98184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203831.98193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203831.98351: variable 'network_connections' from source: task vars 
13271 1727203831.98368: variable 'controller_profile' from source: play vars 13271 1727203831.98448: variable 'controller_profile' from source: play vars 13271 1727203831.98465: variable 'controller_device' from source: play vars 13271 1727203831.98533: variable 'controller_device' from source: play vars 13271 1727203831.98548: variable 'port1_profile' from source: play vars 13271 1727203831.98612: variable 'port1_profile' from source: play vars 13271 1727203831.98644: variable 'dhcp_interface1' from source: play vars 13271 1727203831.98694: variable 'dhcp_interface1' from source: play vars 13271 1727203831.98706: variable 'controller_profile' from source: play vars 13271 1727203831.98862: variable 'controller_profile' from source: play vars 13271 1727203831.98865: variable 'port2_profile' from source: play vars 13271 1727203831.98868: variable 'port2_profile' from source: play vars 13271 1727203831.98870: variable 'dhcp_interface2' from source: play vars 13271 1727203831.98915: variable 'dhcp_interface2' from source: play vars 13271 1727203831.98927: variable 'controller_profile' from source: play vars 13271 1727203831.98992: variable 'controller_profile' from source: play vars 13271 1727203831.99065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13271 1727203831.99239: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13271 1727203831.99281: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13271 1727203831.99320: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13271 1727203831.99352: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13271 1727203831.99404: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13271 1727203831.99432: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13271 1727203831.99462: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203831.99506: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13271 1727203831.99560: variable '__network_wireless_connections_defined' from source: role '' defaults 13271 1727203831.99830: variable 'network_connections' from source: task vars 13271 1727203831.99833: variable 'controller_profile' from source: play vars 13271 1727203831.99892: variable 'controller_profile' from source: play vars 13271 1727203831.99939: variable 'controller_device' from source: play vars 13271 1727203831.99968: variable 'controller_device' from source: play vars 13271 1727203831.99985: variable 'port1_profile' from source: play vars 13271 1727203832.00048: variable 'port1_profile' from source: play vars 13271 1727203832.00059: variable 'dhcp_interface1' from source: play vars 13271 1727203832.00113: variable 'dhcp_interface1' from source: play vars 13271 1727203832.00156: variable 'controller_profile' from source: play vars 13271 1727203832.00187: variable 'controller_profile' from source: play vars 13271 1727203832.00199: variable 'port2_profile' from source: play vars 13271 1727203832.00264: variable 'port2_profile' from source: play vars 13271 1727203832.00277: variable 'dhcp_interface2' from source: play vars 13271 1727203832.00332: variable 'dhcp_interface2' from source: play vars 13271 
1727203832.00373: variable 'controller_profile' from source: play vars 13271 1727203832.00406: variable 'controller_profile' from source: play vars 13271 1727203832.00452: Evaluated conditional (__network_wpa_supplicant_required): False 13271 1727203832.00460: when evaluation is False, skipping this task 13271 1727203832.00467: _execute() done 13271 1727203832.00680: dumping result to json 13271 1727203832.00685: done dumping result, returning 13271 1727203832.00689: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-2a40-12ba-000000000033] 13271 1727203832.00692: sending task result for task 028d2410-947f-2a40-12ba-000000000033 13271 1727203832.00759: done sending task result for task 028d2410-947f-2a40-12ba-000000000033 13271 1727203832.00762: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 13271 1727203832.00806: no more pending results, returning what we have 13271 1727203832.00809: results queue empty 13271 1727203832.00810: checking for any_errors_fatal 13271 1727203832.00827: done checking for any_errors_fatal 13271 1727203832.00827: checking for max_fail_percentage 13271 1727203832.00829: done checking for max_fail_percentage 13271 1727203832.00830: checking to see if all hosts have failed and the running result is not ok 13271 1727203832.00831: done checking to see if all hosts have failed 13271 1727203832.00832: getting the remaining hosts for this loop 13271 1727203832.00833: done getting the remaining hosts for this loop 13271 1727203832.00836: getting the next task for host managed-node1 13271 1727203832.00841: done getting next task for host managed-node1 13271 1727203832.00845: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 13271 1727203832.00847: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203832.00859: getting variables 13271 1727203832.00861: in VariableManager get_vars() 13271 1727203832.00901: Calling all_inventory to load vars for managed-node1 13271 1727203832.00904: Calling groups_inventory to load vars for managed-node1 13271 1727203832.00906: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203832.00915: Calling all_plugins_play to load vars for managed-node1 13271 1727203832.00917: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203832.00919: Calling groups_plugins_play to load vars for managed-node1 13271 1727203832.02232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203832.03782: done with get_vars() 13271 1727203832.03804: done getting variables 13271 1727203832.03858: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:50:32 -0400 (0:00:00.102) 0:00:15.682 ***** 13271 1727203832.03893: entering _queue_task() for managed-node1/service 
13271 1727203832.04197: worker is 1 (out of 1 available) 13271 1727203832.04209: exiting _queue_task() for managed-node1/service 13271 1727203832.04219: done queuing things up, now waiting for results queue to drain 13271 1727203832.04221: waiting for pending results... 13271 1727203832.04497: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 13271 1727203832.04614: in run() - task 028d2410-947f-2a40-12ba-000000000034 13271 1727203832.04636: variable 'ansible_search_path' from source: unknown 13271 1727203832.04644: variable 'ansible_search_path' from source: unknown 13271 1727203832.04699: calling self._execute() 13271 1727203832.04810: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203832.04882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203832.04886: variable 'omit' from source: magic vars 13271 1727203832.05233: variable 'ansible_distribution_major_version' from source: facts 13271 1727203832.05254: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203832.05381: variable 'network_provider' from source: set_fact 13271 1727203832.05393: Evaluated conditional (network_provider == "initscripts"): False 13271 1727203832.05401: when evaluation is False, skipping this task 13271 1727203832.05408: _execute() done 13271 1727203832.05415: dumping result to json 13271 1727203832.05423: done dumping result, returning 13271 1727203832.05432: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-2a40-12ba-000000000034] 13271 1727203832.05443: sending task result for task 028d2410-947f-2a40-12ba-000000000034 skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13271 1727203832.05722: no more pending results, returning what we have 13271 
1727203832.05726: results queue empty 13271 1727203832.05727: checking for any_errors_fatal 13271 1727203832.05736: done checking for any_errors_fatal 13271 1727203832.05737: checking for max_fail_percentage 13271 1727203832.05739: done checking for max_fail_percentage 13271 1727203832.05739: checking to see if all hosts have failed and the running result is not ok 13271 1727203832.05740: done checking to see if all hosts have failed 13271 1727203832.05741: getting the remaining hosts for this loop 13271 1727203832.05742: done getting the remaining hosts for this loop 13271 1727203832.05746: getting the next task for host managed-node1 13271 1727203832.05753: done getting next task for host managed-node1 13271 1727203832.05757: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13271 1727203832.05760: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203832.05781: getting variables 13271 1727203832.05783: in VariableManager get_vars() 13271 1727203832.05825: Calling all_inventory to load vars for managed-node1 13271 1727203832.05828: Calling groups_inventory to load vars for managed-node1 13271 1727203832.05830: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203832.05844: Calling all_plugins_play to load vars for managed-node1 13271 1727203832.05847: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203832.05850: Calling groups_plugins_play to load vars for managed-node1 13271 1727203832.06388: done sending task result for task 028d2410-947f-2a40-12ba-000000000034 13271 1727203832.06391: WORKER PROCESS EXITING 13271 1727203832.07446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203832.08340: done with get_vars() 13271 1727203832.08354: done getting variables 13271 1727203832.08399: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:50:32 -0400 (0:00:00.045) 0:00:15.727 ***** 13271 1727203832.08422: entering _queue_task() for managed-node1/copy 13271 1727203832.08637: worker is 1 (out of 1 available) 13271 1727203832.08650: exiting _queue_task() for managed-node1/copy 13271 1727203832.08665: done queuing things up, now waiting for results queue to drain 13271 1727203832.08667: waiting for pending results... 
13271 1727203832.08833: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 13271 1727203832.08920: in run() - task 028d2410-947f-2a40-12ba-000000000035 13271 1727203832.08930: variable 'ansible_search_path' from source: unknown 13271 1727203832.08934: variable 'ansible_search_path' from source: unknown 13271 1727203832.08965: calling self._execute() 13271 1727203832.09029: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203832.09033: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203832.09041: variable 'omit' from source: magic vars 13271 1727203832.09366: variable 'ansible_distribution_major_version' from source: facts 13271 1727203832.09383: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203832.09518: variable 'network_provider' from source: set_fact 13271 1727203832.09522: Evaluated conditional (network_provider == "initscripts"): False 13271 1727203832.09525: when evaluation is False, skipping this task 13271 1727203832.09527: _execute() done 13271 1727203832.09530: dumping result to json 13271 1727203832.09532: done dumping result, returning 13271 1727203832.09536: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-2a40-12ba-000000000035] 13271 1727203832.09538: sending task result for task 028d2410-947f-2a40-12ba-000000000035 13271 1727203832.09611: done sending task result for task 028d2410-947f-2a40-12ba-000000000035 13271 1727203832.09614: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13271 1727203832.09667: no more pending results, returning what we have 13271 1727203832.09671: results queue empty 13271 1727203832.09672: checking for 
any_errors_fatal 13271 1727203832.09699: done checking for any_errors_fatal 13271 1727203832.09700: checking for max_fail_percentage 13271 1727203832.09702: done checking for max_fail_percentage 13271 1727203832.09703: checking to see if all hosts have failed and the running result is not ok 13271 1727203832.09705: done checking to see if all hosts have failed 13271 1727203832.09705: getting the remaining hosts for this loop 13271 1727203832.09707: done getting the remaining hosts for this loop 13271 1727203832.09711: getting the next task for host managed-node1 13271 1727203832.09718: done getting next task for host managed-node1 13271 1727203832.09722: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13271 1727203832.09726: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203832.09741: getting variables 13271 1727203832.09743: in VariableManager get_vars() 13271 1727203832.09792: Calling all_inventory to load vars for managed-node1 13271 1727203832.09795: Calling groups_inventory to load vars for managed-node1 13271 1727203832.09798: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203832.09809: Calling all_plugins_play to load vars for managed-node1 13271 1727203832.09812: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203832.09814: Calling groups_plugins_play to load vars for managed-node1 13271 1727203832.10857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203832.11711: done with get_vars() 13271 1727203832.11726: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:50:32 -0400 (0:00:00.033) 0:00:15.761 ***** 13271 1727203832.11787: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 13271 1727203832.11789: Creating lock for fedora.linux_system_roles.network_connections 13271 1727203832.11999: worker is 1 (out of 1 available) 13271 1727203832.12013: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 13271 1727203832.12026: done queuing things up, now waiting for results queue to drain 13271 1727203832.12027: waiting for pending results... 
13271 1727203832.12194: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13271 1727203832.12280: in run() - task 028d2410-947f-2a40-12ba-000000000036 13271 1727203832.12291: variable 'ansible_search_path' from source: unknown 13271 1727203832.12294: variable 'ansible_search_path' from source: unknown 13271 1727203832.12322: calling self._execute() 13271 1727203832.12402: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203832.12406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203832.12415: variable 'omit' from source: magic vars 13271 1727203832.12795: variable 'ansible_distribution_major_version' from source: facts 13271 1727203832.12799: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203832.12801: variable 'omit' from source: magic vars 13271 1727203832.12840: variable 'omit' from source: magic vars 13271 1727203832.12999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13271 1727203832.15353: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13271 1727203832.15416: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13271 1727203832.15451: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13271 1727203832.15489: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13271 1727203832.15514: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13271 1727203832.15592: variable 'network_provider' from source: set_fact 13271 1727203832.15722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203832.15748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203832.15779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203832.15818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203832.15844: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203832.15951: variable 'omit' from source: magic vars 13271 1727203832.16017: variable 'omit' from source: magic vars 13271 1727203832.16115: variable 'network_connections' from source: task vars 13271 1727203832.16125: variable 'controller_profile' from source: play vars 13271 1727203832.16277: variable 'controller_profile' from source: play vars 13271 1727203832.16281: variable 'controller_device' from source: play vars 13271 1727203832.16283: variable 'controller_device' from source: play vars 13271 1727203832.16286: variable 'port1_profile' from source: play vars 13271 1727203832.16312: variable 'port1_profile' from source: play vars 13271 1727203832.16319: variable 'dhcp_interface1' from source: play vars 13271 1727203832.16382: variable 'dhcp_interface1' from source: play vars 13271 1727203832.16388: variable 'controller_profile' from source: play vars 13271 1727203832.16438: variable 'controller_profile' from source: play vars 13271 1727203832.16452: 
variable 'port2_profile' from source: play vars 13271 1727203832.16536: variable 'port2_profile' from source: play vars 13271 1727203832.16549: variable 'dhcp_interface2' from source: play vars 13271 1727203832.16616: variable 'dhcp_interface2' from source: play vars 13271 1727203832.16627: variable 'controller_profile' from source: play vars 13271 1727203832.16780: variable 'controller_profile' from source: play vars 13271 1727203832.16874: variable 'omit' from source: magic vars 13271 1727203832.16891: variable '__lsr_ansible_managed' from source: task vars 13271 1727203832.16950: variable '__lsr_ansible_managed' from source: task vars 13271 1727203832.17132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13271 1727203832.17343: Loaded config def from plugin (lookup/template) 13271 1727203832.17353: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13271 1727203832.17388: File lookup term: get_ansible_managed.j2 13271 1727203832.17396: variable 'ansible_search_path' from source: unknown 13271 1727203832.17405: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13271 1727203832.17421: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13271 1727203832.17442: variable 'ansible_search_path' from source: unknown 13271 1727203832.23199: variable 'ansible_managed' from source: unknown 13271 1727203832.23340: variable 'omit' from source: magic vars 13271 1727203832.23377: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203832.23409: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203832.23581: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203832.23584: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203832.23587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203832.23591: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203832.23594: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203832.23596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203832.23620: Set connection var ansible_connection to ssh 13271 1727203832.23634: Set connection var ansible_shell_type to sh 13271 1727203832.23648: Set connection var ansible_timeout to 10 13271 1727203832.23659: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203832.23673: Set connection var ansible_pipelining to False 13271 1727203832.23696: Set connection var ansible_shell_executable to /bin/sh 13271 1727203832.23727: 
variable 'ansible_shell_executable' from source: unknown 13271 1727203832.23736: variable 'ansible_connection' from source: unknown 13271 1727203832.23744: variable 'ansible_module_compression' from source: unknown 13271 1727203832.23751: variable 'ansible_shell_type' from source: unknown 13271 1727203832.23757: variable 'ansible_shell_executable' from source: unknown 13271 1727203832.23768: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203832.23779: variable 'ansible_pipelining' from source: unknown 13271 1727203832.23787: variable 'ansible_timeout' from source: unknown 13271 1727203832.23795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203832.23931: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13271 1727203832.23947: variable 'omit' from source: magic vars 13271 1727203832.23958: starting attempt loop 13271 1727203832.23967: running the handler 13271 1727203832.23987: _low_level_execute_command(): starting 13271 1727203832.23998: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203832.24704: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203832.24723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203832.24792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203832.24841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203832.24855: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203832.24882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203832.25002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203832.26797: stdout chunk (state=3): >>>/root <<< 13271 1727203832.26915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203832.26928: stderr chunk (state=3): >>><<< 13271 1727203832.26932: stdout chunk (state=3): >>><<< 13271 1727203832.26952: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203832.26965: _low_level_execute_command(): starting 13271 1727203832.26968: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203832.2695148-14497-7335642733822 `" && echo ansible-tmp-1727203832.2695148-14497-7335642733822="` echo /root/.ansible/tmp/ansible-tmp-1727203832.2695148-14497-7335642733822 `" ) && sleep 0' 13271 1727203832.27425: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203832.27428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203832.27431: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203832.27433: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 13271 1727203832.27435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 13271 1727203832.27437: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203832.27481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203832.27484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203832.27489: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203832.27568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203832.29655: stdout chunk (state=3): >>>ansible-tmp-1727203832.2695148-14497-7335642733822=/root/.ansible/tmp/ansible-tmp-1727203832.2695148-14497-7335642733822 <<< 13271 1727203832.29792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203832.29803: stderr chunk (state=3): >>><<< 13271 1727203832.29806: stdout chunk (state=3): >>><<< 13271 1727203832.29830: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203832.2695148-14497-7335642733822=/root/.ansible/tmp/ansible-tmp-1727203832.2695148-14497-7335642733822 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203832.29873: variable 'ansible_module_compression' from source: unknown 13271 1727203832.29913: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 13271 1727203832.29916: ANSIBALLZ: Acquiring lock 13271 1727203832.29918: ANSIBALLZ: Lock acquired: 140497825729520 13271 1727203832.29921: ANSIBALLZ: Creating module 13271 1727203832.45073: ANSIBALLZ: Writing module into payload 13271 1727203832.45284: ANSIBALLZ: Writing module 13271 1727203832.45305: ANSIBALLZ: Renaming module 13271 1727203832.45309: ANSIBALLZ: Done creating module 13271 1727203832.45330: variable 'ansible_facts' from source: unknown 13271 1727203832.45406: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203832.2695148-14497-7335642733822/AnsiballZ_network_connections.py 13271 1727203832.45508: Sending initial data 13271 1727203832.45512: Sent initial data (166 bytes) 13271 1727203832.45944: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203832.45948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203832.45950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203832.45952: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13271 1727203832.45954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203832.46012: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203832.46015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203832.46017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203832.46102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203832.47853: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 13271 1727203832.47865: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203832.47947: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203832.48023: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpfl95715x /root/.ansible/tmp/ansible-tmp-1727203832.2695148-14497-7335642733822/AnsiballZ_network_connections.py <<< 13271 1727203832.48029: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203832.2695148-14497-7335642733822/AnsiballZ_network_connections.py" <<< 13271 1727203832.48100: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpfl95715x" to remote "/root/.ansible/tmp/ansible-tmp-1727203832.2695148-14497-7335642733822/AnsiballZ_network_connections.py" <<< 13271 1727203832.48103: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203832.2695148-14497-7335642733822/AnsiballZ_network_connections.py" <<< 13271 1727203832.48958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203832.48997: stderr chunk (state=3): >>><<< 13271 1727203832.49000: stdout chunk (state=3): >>><<< 13271 1727203832.49024: done transferring module to remote 13271 1727203832.49033: _low_level_execute_command(): starting 13271 1727203832.49038: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203832.2695148-14497-7335642733822/ /root/.ansible/tmp/ansible-tmp-1727203832.2695148-14497-7335642733822/AnsiballZ_network_connections.py && sleep 0' 13271 1727203832.49451: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203832.49459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203832.49481: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203832.49486: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 13271 1727203832.49505: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203832.49508: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203832.49547: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203832.49550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203832.49635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203832.51611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203832.51634: stderr chunk (state=3): >>><<< 13271 1727203832.51637: stdout chunk (state=3): >>><<< 13271 1727203832.51651: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203832.51658: _low_level_execute_command(): starting 13271 1727203832.51661: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203832.2695148-14497-7335642733822/AnsiballZ_network_connections.py && sleep 0' 13271 1727203832.52080: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203832.52083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203832.52085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203832.52087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203832.52141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203832.52148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203832.52150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203832.52232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203833.01934: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 37efd94b-f2e9-48ef-ae23-b04b9f9540cf\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 7bd864cb-338e-4797-a338-3ce206f3a7c2\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, f89efd2b-3089-4f6f-a8eb-220a10d50c35\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 37efd94b-f2e9-48ef-ae23-b04b9f9540cf (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 7bd864cb-338e-4797-a338-3ce206f3a7c2 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, f89efd2b-3089-4f6f-a8eb-220a10d50c35 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": 
"up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13271 1727203833.04250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 13271 1727203833.04284: stderr chunk (state=3): >>><<< 13271 1727203833.04287: stdout chunk (state=3): >>><<< 13271 1727203833.04304: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 37efd94b-f2e9-48ef-ae23-b04b9f9540cf\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 7bd864cb-338e-4797-a338-3ce206f3a7c2\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, f89efd2b-3089-4f6f-a8eb-220a10d50c35\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 37efd94b-f2e9-48ef-ae23-b04b9f9540cf (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 7bd864cb-338e-4797-a338-3ce206f3a7c2 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, f89efd2b-3089-4f6f-a8eb-220a10d50c35 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", 
"interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
13271 1727203833.04346: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203832.2695148-14497-7335642733822/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203833.04354: _low_level_execute_command(): starting 13271 1727203833.04360: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203832.2695148-14497-7335642733822/ > /dev/null 2>&1 && sleep 0' 13271 1727203833.04821: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203833.04827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203833.04844: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203833.04888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203833.04895: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203833.04901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203833.04995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203833.07014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203833.07041: stderr chunk (state=3): >>><<< 13271 1727203833.07044: stdout chunk (state=3): >>><<< 13271 1727203833.07059: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203833.07066: handler run complete 13271 1727203833.07097: attempt loop complete, returning result 13271 1727203833.07100: _execute() done 13271 1727203833.07103: dumping result to json 13271 1727203833.07109: done dumping result, returning 13271 1727203833.07117: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-2a40-12ba-000000000036] 13271 1727203833.07121: sending task result for task 028d2410-947f-2a40-12ba-000000000036 13271 1727203833.07232: done sending task result for task 028d2410-947f-2a40-12ba-000000000036 13271 1727203833.07235: WORKER PROCESS EXITING
changed: [managed-node1] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "bond": {
                        "miimon": 110,
                        "mode": "active-backup"
                    },
                    "interface_name": "nm-bond",
                    "ip": {
                        "route_metric4": 65535
                    },
                    "name": "bond0",
                    "state": "up",
                    "type": "bond"
                },
                {
                    "controller": "bond0",
                    "interface_name": "test1",
                    "name": "bond0.0",
                    "state": "up",
                    "type": "ethernet"
                },
                {
                    "controller": "bond0",
                    "interface_name": "test2",
                    "name": "bond0.1",
                    "state": "up",
                    "type": "ethernet"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 37efd94b-f2e9-48ef-ae23-b04b9f9540cf
[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 7bd864cb-338e-4797-a338-3ce206f3a7c2
[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, f89efd2b-3089-4f6f-a8eb-220a10d50c35
[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 37efd94b-f2e9-48ef-ae23-b04b9f9540cf (is-modified)
[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 7bd864cb-338e-4797-a338-3ce206f3a7c2 (not-active)
[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, f89efd2b-3089-4f6f-a8eb-220a10d50c35 (not-active)
13271 1727203833.07370: no more pending results, returning what we have 13271 1727203833.07373: results queue empty 13271 1727203833.07374: checking for any_errors_fatal 13271 1727203833.07382: done checking for any_errors_fatal 13271 1727203833.07383: checking for max_fail_percentage 13271 1727203833.07384: done checking for max_fail_percentage 13271 1727203833.07385: checking to see if all hosts have failed and the running result is not ok 13271 1727203833.07386: done checking to see if all hosts have failed 13271 1727203833.07387: getting the remaining hosts for this loop 13271 1727203833.07389: done getting the remaining hosts for this loop 13271 1727203833.07392: getting the next task for host managed-node1 13271 1727203833.07397: done getting next task for host managed-node1 13271 1727203833.07400: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13271 1727203833.07402: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state?
(None), always child state? (None), did rescue? False, did start at task? False 13271 1727203833.07410: getting variables 13271 1727203833.07412: in VariableManager get_vars() 13271 1727203833.07448: Calling all_inventory to load vars for managed-node1 13271 1727203833.07450: Calling groups_inventory to load vars for managed-node1 13271 1727203833.07452: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203833.07469: Calling all_plugins_play to load vars for managed-node1 13271 1727203833.07472: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203833.07474: Calling groups_plugins_play to load vars for managed-node1 13271 1727203833.08386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203833.09237: done with get_vars() 13271 1727203833.09252: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:50:33 -0400 (0:00:00.975) 0:00:16.736 ***** 13271 1727203833.09316: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 13271 1727203833.09317: Creating lock for fedora.linux_system_roles.network_state 13271 1727203833.09547: worker is 1 (out of 1 available) 13271 1727203833.09559: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 13271 1727203833.09570: done queuing things up, now waiting for results queue to drain 13271 1727203833.09572: waiting for pending results... 
13271 1727203833.09741: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 13271 1727203833.09825: in run() - task 028d2410-947f-2a40-12ba-000000000037 13271 1727203833.09836: variable 'ansible_search_path' from source: unknown 13271 1727203833.09840: variable 'ansible_search_path' from source: unknown 13271 1727203833.09871: calling self._execute() 13271 1727203833.09938: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203833.09942: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203833.09951: variable 'omit' from source: magic vars 13271 1727203833.10216: variable 'ansible_distribution_major_version' from source: facts 13271 1727203833.10225: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203833.10310: variable 'network_state' from source: role '' defaults 13271 1727203833.10317: Evaluated conditional (network_state != {}): False 13271 1727203833.10320: when evaluation is False, skipping this task 13271 1727203833.10323: _execute() done 13271 1727203833.10326: dumping result to json 13271 1727203833.10328: done dumping result, returning 13271 1727203833.10335: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-2a40-12ba-000000000037] 13271 1727203833.10339: sending task result for task 028d2410-947f-2a40-12ba-000000000037 13271 1727203833.10416: done sending task result for task 028d2410-947f-2a40-12ba-000000000037 13271 1727203833.10418: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13271 1727203833.10498: no more pending results, returning what we have 13271 1727203833.10502: results queue empty 13271 1727203833.10502: checking for any_errors_fatal 13271 1727203833.10511: done checking for any_errors_fatal
13271 1727203833.10512: checking for max_fail_percentage 13271 1727203833.10513: done checking for max_fail_percentage 13271 1727203833.10514: checking to see if all hosts have failed and the running result is not ok 13271 1727203833.10515: done checking to see if all hosts have failed 13271 1727203833.10516: getting the remaining hosts for this loop 13271 1727203833.10517: done getting the remaining hosts for this loop 13271 1727203833.10520: getting the next task for host managed-node1 13271 1727203833.10524: done getting next task for host managed-node1 13271 1727203833.10527: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13271 1727203833.10530: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203833.10542: getting variables 13271 1727203833.10543: in VariableManager get_vars() 13271 1727203833.10574: Calling all_inventory to load vars for managed-node1 13271 1727203833.10579: Calling groups_inventory to load vars for managed-node1 13271 1727203833.10581: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203833.10589: Calling all_plugins_play to load vars for managed-node1 13271 1727203833.10591: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203833.10593: Calling groups_plugins_play to load vars for managed-node1 13271 1727203833.11311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203833.12165: done with get_vars() 13271 1727203833.12181: done getting variables 13271 1727203833.12221: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:50:33 -0400 (0:00:00.029) 0:00:16.765 ***** 13271 1727203833.12245: entering _queue_task() for managed-node1/debug 13271 1727203833.12449: worker is 1 (out of 1 available) 13271 1727203833.12461: exiting _queue_task() for managed-node1/debug 13271 1727203833.12473: done queuing things up, now waiting for results queue to drain 13271 1727203833.12476: waiting for pending results... 
13271 1727203833.12641: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13271 1727203833.12722: in run() - task 028d2410-947f-2a40-12ba-000000000038 13271 1727203833.12734: variable 'ansible_search_path' from source: unknown 13271 1727203833.12738: variable 'ansible_search_path' from source: unknown 13271 1727203833.12765: calling self._execute() 13271 1727203833.12833: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203833.12839: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203833.12847: variable 'omit' from source: magic vars 13271 1727203833.13104: variable 'ansible_distribution_major_version' from source: facts 13271 1727203833.13113: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203833.13118: variable 'omit' from source: magic vars 13271 1727203833.13157: variable 'omit' from source: magic vars 13271 1727203833.13185: variable 'omit' from source: magic vars 13271 1727203833.13216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203833.13248: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203833.13262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203833.13279: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203833.13289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203833.13311: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203833.13314: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203833.13317: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 13271 1727203833.13387: Set connection var ansible_connection to ssh 13271 1727203833.13393: Set connection var ansible_shell_type to sh 13271 1727203833.13400: Set connection var ansible_timeout to 10 13271 1727203833.13405: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203833.13410: Set connection var ansible_pipelining to False 13271 1727203833.13415: Set connection var ansible_shell_executable to /bin/sh 13271 1727203833.13432: variable 'ansible_shell_executable' from source: unknown 13271 1727203833.13435: variable 'ansible_connection' from source: unknown 13271 1727203833.13438: variable 'ansible_module_compression' from source: unknown 13271 1727203833.13440: variable 'ansible_shell_type' from source: unknown 13271 1727203833.13443: variable 'ansible_shell_executable' from source: unknown 13271 1727203833.13445: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203833.13448: variable 'ansible_pipelining' from source: unknown 13271 1727203833.13450: variable 'ansible_timeout' from source: unknown 13271 1727203833.13455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203833.13555: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203833.13562: variable 'omit' from source: magic vars 13271 1727203833.13569: starting attempt loop 13271 1727203833.13581: running the handler 13271 1727203833.13669: variable '__network_connections_result' from source: set_fact 13271 1727203833.13716: handler run complete 13271 1727203833.13729: attempt loop complete, returning result 13271 1727203833.13732: _execute() done 13271 1727203833.13735: dumping result to json 13271 1727203833.13739: 
done dumping result, returning 13271 1727203833.13748: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-2a40-12ba-000000000038] 13271 1727203833.13752: sending task result for task 028d2410-947f-2a40-12ba-000000000038 13271 1727203833.13832: done sending task result for task 028d2410-947f-2a40-12ba-000000000038 13271 1727203833.13835: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "__network_connections_result.stderr_lines": [
        "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 37efd94b-f2e9-48ef-ae23-b04b9f9540cf",
        "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 7bd864cb-338e-4797-a338-3ce206f3a7c2",
        "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, f89efd2b-3089-4f6f-a8eb-220a10d50c35",
        "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 37efd94b-f2e9-48ef-ae23-b04b9f9540cf (is-modified)",
        "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 7bd864cb-338e-4797-a338-3ce206f3a7c2 (not-active)",
        "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, f89efd2b-3089-4f6f-a8eb-220a10d50c35 (not-active)"
    ]
}
13271 1727203833.13898: no more pending results, returning what we have 13271 1727203833.13900: results queue empty 13271 1727203833.13901: checking for any_errors_fatal 13271 1727203833.13906: done checking for any_errors_fatal 13271 1727203833.13907: checking for max_fail_percentage 13271 1727203833.13909: done checking for max_fail_percentage 13271 1727203833.13909: checking to see if all hosts have failed and the running result is not ok 13271 1727203833.13910: done checking to see if all hosts have failed 13271 1727203833.13911: getting the remaining hosts for this loop 13271 1727203833.13912: done getting the remaining hosts for this loop 13271 1727203833.13915: getting the next task for host
managed-node1 13271 1727203833.13921: done getting next task for host managed-node1 13271 1727203833.13924: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13271 1727203833.13926: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203833.13935: getting variables 13271 1727203833.13937: in VariableManager get_vars() 13271 1727203833.13971: Calling all_inventory to load vars for managed-node1 13271 1727203833.13973: Calling groups_inventory to load vars for managed-node1 13271 1727203833.13977: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203833.13985: Calling all_plugins_play to load vars for managed-node1 13271 1727203833.13987: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203833.13989: Calling groups_plugins_play to load vars for managed-node1 13271 1727203833.14827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203833.15689: done with get_vars() 13271 1727203833.15707: done getting variables 13271 1727203833.15750: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show 
debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:50:33 -0400 (0:00:00.035) 0:00:16.801 ***** 13271 1727203833.15785: entering _queue_task() for managed-node1/debug 13271 1727203833.16025: worker is 1 (out of 1 available) 13271 1727203833.16038: exiting _queue_task() for managed-node1/debug 13271 1727203833.16049: done queuing things up, now waiting for results queue to drain 13271 1727203833.16051: waiting for pending results... 13271 1727203833.16226: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13271 1727203833.16297: in run() - task 028d2410-947f-2a40-12ba-000000000039 13271 1727203833.16309: variable 'ansible_search_path' from source: unknown 13271 1727203833.16313: variable 'ansible_search_path' from source: unknown 13271 1727203833.16341: calling self._execute() 13271 1727203833.16416: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203833.16420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203833.16429: variable 'omit' from source: magic vars 13271 1727203833.16694: variable 'ansible_distribution_major_version' from source: facts 13271 1727203833.16702: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203833.16708: variable 'omit' from source: magic vars 13271 1727203833.16747: variable 'omit' from source: magic vars 13271 1727203833.16772: variable 'omit' from source: magic vars 13271 1727203833.16806: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203833.16836: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203833.16852: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
13271 1727203833.16881: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203833.16884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203833.16899: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203833.16902: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203833.16906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203833.16974: Set connection var ansible_connection to ssh 13271 1727203833.16982: Set connection var ansible_shell_type to sh 13271 1727203833.16989: Set connection var ansible_timeout to 10 13271 1727203833.16994: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203833.16999: Set connection var ansible_pipelining to False 13271 1727203833.17004: Set connection var ansible_shell_executable to /bin/sh 13271 1727203833.17022: variable 'ansible_shell_executable' from source: unknown 13271 1727203833.17025: variable 'ansible_connection' from source: unknown 13271 1727203833.17028: variable 'ansible_module_compression' from source: unknown 13271 1727203833.17033: variable 'ansible_shell_type' from source: unknown 13271 1727203833.17038: variable 'ansible_shell_executable' from source: unknown 13271 1727203833.17041: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203833.17043: variable 'ansible_pipelining' from source: unknown 13271 1727203833.17045: variable 'ansible_timeout' from source: unknown 13271 1727203833.17047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203833.17146: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203833.17154: variable 'omit' from source: magic vars 13271 1727203833.17158: starting attempt loop 13271 1727203833.17164: running the handler 13271 1727203833.17203: variable '__network_connections_result' from source: set_fact 13271 1727203833.17255: variable '__network_connections_result' from source: set_fact 13271 1727203833.17363: handler run complete 13271 1727203833.17384: attempt loop complete, returning result 13271 1727203833.17388: _execute() done 13271 1727203833.17390: dumping result to json 13271 1727203833.17393: done dumping result, returning 13271 1727203833.17403: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-2a40-12ba-000000000039] 13271 1727203833.17406: sending task result for task 028d2410-947f-2a40-12ba-000000000039 13271 1727203833.17495: done sending task result for task 028d2410-947f-2a40-12ba-000000000039 13271 1727203833.17498: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "bond": {
                            "miimon": 110,
                            "mode": "active-backup"
                        },
                        "interface_name": "nm-bond",
                        "ip": {
                            "route_metric4": 65535
                        },
                        "name": "bond0",
                        "state": "up",
                        "type": "bond"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test1",
                        "name": "bond0.0",
                        "state": "up",
                        "type": "ethernet"
                    },
                    {
                        "controller": "bond0",
                        "interface_name": "test2",
                        "name": "bond0.1",
                        "state": "up",
                        "type": "ethernet"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 37efd94b-f2e9-48ef-ae23-b04b9f9540cf\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 7bd864cb-338e-4797-a338-3ce206f3a7c2\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, f89efd2b-3089-4f6f-a8eb-220a10d50c35\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 37efd94b-f2e9-48ef-ae23-b04b9f9540cf (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 7bd864cb-338e-4797-a338-3ce206f3a7c2 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, f89efd2b-3089-4f6f-a8eb-220a10d50c35 (not-active)\n",
        "stderr_lines": [
            "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 37efd94b-f2e9-48ef-ae23-b04b9f9540cf",
            "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 7bd864cb-338e-4797-a338-3ce206f3a7c2",
            "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, f89efd2b-3089-4f6f-a8eb-220a10d50c35",
            "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 37efd94b-f2e9-48ef-ae23-b04b9f9540cf (is-modified)",
            "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 7bd864cb-338e-4797-a338-3ce206f3a7c2 (not-active)",
            "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, f89efd2b-3089-4f6f-a8eb-220a10d50c35 (not-active)"
        ]
    }
}
13271 1727203833.17595: no more pending results, returning what we have 13271 1727203833.17598: results queue empty 13271 1727203833.17604: checking for any_errors_fatal 13271 1727203833.17609: done checking for any_errors_fatal 13271 1727203833.17611: checking for max_fail_percentage 13271 1727203833.17613: done checking for max_fail_percentage 13271 1727203833.17613: checking to see if all hosts have failed and the running result is not ok 13271 1727203833.17615: done checking to see if all hosts have failed 13271 1727203833.17615: getting the remaining
hosts for this loop 13271 1727203833.17616: done getting the remaining hosts for this loop 13271 1727203833.17619: getting the next task for host managed-node1 13271 1727203833.17625: done getting next task for host managed-node1 13271 1727203833.17628: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13271 1727203833.17630: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203833.17639: getting variables 13271 1727203833.17640: in VariableManager get_vars() 13271 1727203833.17674: Calling all_inventory to load vars for managed-node1 13271 1727203833.17683: Calling groups_inventory to load vars for managed-node1 13271 1727203833.17685: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203833.17694: Calling all_plugins_play to load vars for managed-node1 13271 1727203833.17696: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203833.17699: Calling groups_plugins_play to load vars for managed-node1 13271 1727203833.18436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203833.19385: done with get_vars() 13271 1727203833.19400: done getting variables 13271 1727203833.19444: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:50:33 -0400 (0:00:00.036) 0:00:16.837 ***** 13271 1727203833.19470: entering _queue_task() for managed-node1/debug 13271 1727203833.19708: worker is 1 (out of 1 available) 13271 1727203833.19721: exiting _queue_task() for managed-node1/debug 13271 1727203833.19733: done queuing things up, now waiting for results queue to drain 13271 1727203833.19734: waiting for pending results... 13271 1727203833.19907: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13271 1727203833.19985: in run() - task 028d2410-947f-2a40-12ba-00000000003a 13271 1727203833.19998: variable 'ansible_search_path' from source: unknown 13271 1727203833.20001: variable 'ansible_search_path' from source: unknown 13271 1727203833.20029: calling self._execute() 13271 1727203833.20097: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203833.20101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203833.20109: variable 'omit' from source: magic vars 13271 1727203833.20368: variable 'ansible_distribution_major_version' from source: facts 13271 1727203833.20374: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203833.20456: variable 'network_state' from source: role '' defaults 13271 1727203833.20466: Evaluated conditional (network_state != {}): False 13271 1727203833.20469: when evaluation is False, skipping this task 13271 1727203833.20472: _execute() done 13271 1727203833.20474: dumping result to json 13271 1727203833.20478: done 
dumping result, returning 13271 1727203833.20484: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-2a40-12ba-00000000003a] 13271 1727203833.20490: sending task result for task 028d2410-947f-2a40-12ba-00000000003a 13271 1727203833.20569: done sending task result for task 028d2410-947f-2a40-12ba-00000000003a 13271 1727203833.20571: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "network_state != {}" } 13271 1727203833.20652: no more pending results, returning what we have 13271 1727203833.20655: results queue empty 13271 1727203833.20656: checking for any_errors_fatal 13271 1727203833.20665: done checking for any_errors_fatal 13271 1727203833.20666: checking for max_fail_percentage 13271 1727203833.20667: done checking for max_fail_percentage 13271 1727203833.20668: checking to see if all hosts have failed and the running result is not ok 13271 1727203833.20669: done checking to see if all hosts have failed 13271 1727203833.20670: getting the remaining hosts for this loop 13271 1727203833.20671: done getting the remaining hosts for this loop 13271 1727203833.20674: getting the next task for host managed-node1 13271 1727203833.20681: done getting next task for host managed-node1 13271 1727203833.20684: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13271 1727203833.20686: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 13271 1727203833.20698: getting variables 13271 1727203833.20700: in VariableManager get_vars() 13271 1727203833.20732: Calling all_inventory to load vars for managed-node1 13271 1727203833.20734: Calling groups_inventory to load vars for managed-node1 13271 1727203833.20736: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203833.20744: Calling all_plugins_play to load vars for managed-node1 13271 1727203833.20746: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203833.20749: Calling groups_plugins_play to load vars for managed-node1 13271 1727203833.21469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203833.22332: done with get_vars() 13271 1727203833.22346: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:50:33 -0400 (0:00:00.029) 0:00:16.867 ***** 13271 1727203833.22416: entering _queue_task() for managed-node1/ping 13271 1727203833.22417: Creating lock for ping 13271 1727203833.22637: worker is 1 (out of 1 available) 13271 1727203833.22652: exiting _queue_task() for managed-node1/ping 13271 1727203833.22666: done queuing things up, now waiting for results queue to drain 13271 1727203833.22668: waiting for pending results... 
13271 1727203833.22834: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 13271 1727203833.22911: in run() - task 028d2410-947f-2a40-12ba-00000000003b 13271 1727203833.22923: variable 'ansible_search_path' from source: unknown 13271 1727203833.22926: variable 'ansible_search_path' from source: unknown 13271 1727203833.22952: calling self._execute() 13271 1727203833.23018: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203833.23023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203833.23032: variable 'omit' from source: magic vars 13271 1727203833.23284: variable 'ansible_distribution_major_version' from source: facts 13271 1727203833.23293: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203833.23298: variable 'omit' from source: magic vars 13271 1727203833.23340: variable 'omit' from source: magic vars 13271 1727203833.23366: variable 'omit' from source: magic vars 13271 1727203833.23396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203833.23421: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203833.23444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203833.23454: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203833.23466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203833.23490: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203833.23493: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203833.23496: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 13271 1727203833.23561: Set connection var ansible_connection to ssh 13271 1727203833.23567: Set connection var ansible_shell_type to sh 13271 1727203833.23574: Set connection var ansible_timeout to 10 13271 1727203833.23580: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203833.23585: Set connection var ansible_pipelining to False 13271 1727203833.23590: Set connection var ansible_shell_executable to /bin/sh 13271 1727203833.23608: variable 'ansible_shell_executable' from source: unknown 13271 1727203833.23611: variable 'ansible_connection' from source: unknown 13271 1727203833.23614: variable 'ansible_module_compression' from source: unknown 13271 1727203833.23617: variable 'ansible_shell_type' from source: unknown 13271 1727203833.23619: variable 'ansible_shell_executable' from source: unknown 13271 1727203833.23621: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203833.23623: variable 'ansible_pipelining' from source: unknown 13271 1727203833.23625: variable 'ansible_timeout' from source: unknown 13271 1727203833.23630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203833.23769: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13271 1727203833.23785: variable 'omit' from source: magic vars 13271 1727203833.23788: starting attempt loop 13271 1727203833.23791: running the handler 13271 1727203833.23803: _low_level_execute_command(): starting 13271 1727203833.23810: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203833.24322: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 
1727203833.24327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203833.24330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203833.24387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203833.24390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203833.24396: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203833.24483: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203833.26264: stdout chunk (state=3): >>>/root <<< 13271 1727203833.26362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203833.26395: stderr chunk (state=3): >>><<< 13271 1727203833.26398: stdout chunk (state=3): >>><<< 13271 1727203833.26417: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203833.26427: _low_level_execute_command(): starting 13271 1727203833.26433: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203833.2641623-14520-100827884120458 `" && echo ansible-tmp-1727203833.2641623-14520-100827884120458="` echo /root/.ansible/tmp/ansible-tmp-1727203833.2641623-14520-100827884120458 `" ) && sleep 0' 13271 1727203833.26870: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203833.26873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203833.26877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13271 1727203833.26887: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 13271 1727203833.26889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203833.26926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203833.26938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203833.27024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203833.29110: stdout chunk (state=3): >>>ansible-tmp-1727203833.2641623-14520-100827884120458=/root/.ansible/tmp/ansible-tmp-1727203833.2641623-14520-100827884120458 <<< 13271 1727203833.29222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203833.29249: stderr chunk (state=3): >>><<< 13271 1727203833.29252: stdout chunk (state=3): >>><<< 13271 1727203833.29263: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203833.2641623-14520-100827884120458=/root/.ansible/tmp/ansible-tmp-1727203833.2641623-14520-100827884120458 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203833.29305: variable 'ansible_module_compression' from source: unknown 13271 1727203833.29338: ANSIBALLZ: Using lock for ping 13271 1727203833.29341: ANSIBALLZ: Acquiring lock 13271 1727203833.29343: ANSIBALLZ: Lock acquired: 140497830567600 13271 1727203833.29345: ANSIBALLZ: Creating module 13271 1727203833.37521: ANSIBALLZ: Writing module into payload 13271 1727203833.37561: ANSIBALLZ: Writing module 13271 1727203833.37582: ANSIBALLZ: Renaming module 13271 1727203833.37587: ANSIBALLZ: Done creating module 13271 1727203833.37602: variable 'ansible_facts' from source: unknown 13271 1727203833.37644: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203833.2641623-14520-100827884120458/AnsiballZ_ping.py 13271 1727203833.37744: Sending initial data 13271 1727203833.37747: Sent initial data (153 bytes) 13271 1727203833.38168: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203833.38171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203833.38202: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203833.38205: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13271 1727203833.38208: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203833.38210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203833.38258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203833.38265: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203833.38280: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203833.38360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203833.40119: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203833.40410: stderr chunk (state=3): >>>debug2: 
Sending SSH2_FXP_REALPATH "." <<< 13271 1727203833.40502: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpa5kkhz1r /root/.ansible/tmp/ansible-tmp-1727203833.2641623-14520-100827884120458/AnsiballZ_ping.py <<< 13271 1727203833.40506: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203833.2641623-14520-100827884120458/AnsiballZ_ping.py" <<< 13271 1727203833.40580: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 13271 1727203833.40589: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpa5kkhz1r" to remote "/root/.ansible/tmp/ansible-tmp-1727203833.2641623-14520-100827884120458/AnsiballZ_ping.py" <<< 13271 1727203833.40602: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203833.2641623-14520-100827884120458/AnsiballZ_ping.py" <<< 13271 1727203833.41464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203833.41682: stderr chunk (state=3): >>><<< 13271 1727203833.41685: stdout chunk (state=3): >>><<< 13271 1727203833.41687: done transferring module to remote 13271 1727203833.41689: _low_level_execute_command(): starting 13271 1727203833.41691: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203833.2641623-14520-100827884120458/ /root/.ansible/tmp/ansible-tmp-1727203833.2641623-14520-100827884120458/AnsiballZ_ping.py && sleep 0' 13271 1727203833.42184: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203833.42194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203833.42205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203833.42218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 
1727203833.42230: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203833.42242: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203833.42247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203833.42264: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 1727203833.42268: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 13271 1727203833.42277: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13271 1727203833.42292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203833.42301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203833.42312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203833.42426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203833.42429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203833.42506: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203833.44518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203833.44521: stdout chunk (state=3): >>><<< 13271 1727203833.44524: stderr chunk (state=3): >>><<< 13271 1727203833.44541: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203833.44622: _low_level_execute_command(): starting 13271 1727203833.44626: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203833.2641623-14520-100827884120458/AnsiballZ_ping.py && sleep 0' 13271 1727203833.45147: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203833.45163: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203833.45182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203833.45201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203833.45296: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203833.45340: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203833.45354: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203833.45741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203833.61928: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13271 1727203833.63468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203833.63493: stderr chunk (state=3): >>><<< 13271 1727203833.63496: stdout chunk (state=3): >>><<< 13271 1727203833.63512: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
13271 1727203833.63536: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203833.2641623-14520-100827884120458/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203833.63544: _low_level_execute_command(): starting 13271 1727203833.63548: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203833.2641623-14520-100827884120458/ > /dev/null 2>&1 && sleep 0' 13271 1727203833.63993: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203833.63997: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203833.64004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 13271 1727203833.64006: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203833.64050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203833.64053: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203833.64059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203833.64133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203833.66092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203833.66118: stderr chunk (state=3): >>><<< 13271 1727203833.66121: stdout chunk (state=3): >>><<< 13271 1727203833.66137: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
13271 1727203833.66142: handler run complete 13271 1727203833.66153: attempt loop complete, returning result 13271 1727203833.66156: _execute() done 13271 1727203833.66159: dumping result to json 13271 1727203833.66163: done dumping result, returning 13271 1727203833.66173: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-2a40-12ba-00000000003b] 13271 1727203833.66179: sending task result for task 028d2410-947f-2a40-12ba-00000000003b 13271 1727203833.66262: done sending task result for task 028d2410-947f-2a40-12ba-00000000003b 13271 1727203833.66265: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 13271 1727203833.66322: no more pending results, returning what we have 13271 1727203833.66325: results queue empty 13271 1727203833.66326: checking for any_errors_fatal 13271 1727203833.66331: done checking for any_errors_fatal 13271 1727203833.66331: checking for max_fail_percentage 13271 1727203833.66333: done checking for max_fail_percentage 13271 1727203833.66333: checking to see if all hosts have failed and the running result is not ok 13271 1727203833.66334: done checking to see if all hosts have failed 13271 1727203833.66335: getting the remaining hosts for this loop 13271 1727203833.66336: done getting the remaining hosts for this loop 13271 1727203833.66339: getting the next task for host managed-node1 13271 1727203833.66347: done getting next task for host managed-node1 13271 1727203833.66349: ^ task is: TASK: meta (role_complete) 13271 1727203833.66352: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203833.66361: getting variables 13271 1727203833.66363: in VariableManager get_vars() 13271 1727203833.66406: Calling all_inventory to load vars for managed-node1 13271 1727203833.66409: Calling groups_inventory to load vars for managed-node1 13271 1727203833.66411: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203833.66421: Calling all_plugins_play to load vars for managed-node1 13271 1727203833.66424: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203833.66426: Calling groups_plugins_play to load vars for managed-node1 13271 1727203833.67357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203833.68200: done with get_vars() 13271 1727203833.68219: done getting variables 13271 1727203833.68279: done queuing things up, now waiting for results queue to drain 13271 1727203833.68280: results queue empty 13271 1727203833.68281: checking for any_errors_fatal 13271 1727203833.68282: done checking for any_errors_fatal 13271 1727203833.68283: checking for max_fail_percentage 13271 1727203833.68284: done checking for max_fail_percentage 13271 1727203833.68284: checking to see if all hosts have failed and the running result is not ok 13271 1727203833.68284: done checking to see if all hosts have failed 13271 1727203833.68285: getting the remaining hosts for this loop 13271 1727203833.68286: done getting the remaining hosts for this loop 13271 1727203833.68287: getting the next task for host managed-node1 13271 1727203833.68290: done getting next task for host managed-node1 13271 1727203833.68292: ^ task is: TASK: Include the task 'get_interface_stat.yml' 13271 1727203833.68293: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203833.68295: getting variables 13271 1727203833.68295: in VariableManager get_vars() 13271 1727203833.68305: Calling all_inventory to load vars for managed-node1 13271 1727203833.68306: Calling groups_inventory to load vars for managed-node1 13271 1727203833.68308: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203833.68311: Calling all_plugins_play to load vars for managed-node1 13271 1727203833.68312: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203833.68314: Calling groups_plugins_play to load vars for managed-node1 13271 1727203833.68940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203833.69790: done with get_vars() 13271 1727203833.69804: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:50:33 -0400 (0:00:00.474) 0:00:17.341 ***** 13271 1727203833.69860: entering _queue_task() for managed-node1/include_tasks 13271 1727203833.70106: worker is 1 (out of 1 available) 13271 1727203833.70120: exiting _queue_task() for managed-node1/include_tasks 13271 1727203833.70132: done queuing things up, now waiting for results queue to drain 13271 1727203833.70134: waiting for pending results... 
13271 1727203833.70315: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 13271 1727203833.70396: in run() - task 028d2410-947f-2a40-12ba-00000000006e 13271 1727203833.70407: variable 'ansible_search_path' from source: unknown 13271 1727203833.70411: variable 'ansible_search_path' from source: unknown 13271 1727203833.70438: calling self._execute() 13271 1727203833.70509: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203833.70513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203833.70522: variable 'omit' from source: magic vars 13271 1727203833.70787: variable 'ansible_distribution_major_version' from source: facts 13271 1727203833.70795: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203833.70801: _execute() done 13271 1727203833.70805: dumping result to json 13271 1727203833.70807: done dumping result, returning 13271 1727203833.70818: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [028d2410-947f-2a40-12ba-00000000006e] 13271 1727203833.70820: sending task result for task 028d2410-947f-2a40-12ba-00000000006e 13271 1727203833.70904: done sending task result for task 028d2410-947f-2a40-12ba-00000000006e 13271 1727203833.70907: WORKER PROCESS EXITING 13271 1727203833.70933: no more pending results, returning what we have 13271 1727203833.70937: in VariableManager get_vars() 13271 1727203833.70980: Calling all_inventory to load vars for managed-node1 13271 1727203833.70983: Calling groups_inventory to load vars for managed-node1 13271 1727203833.70985: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203833.70997: Calling all_plugins_play to load vars for managed-node1 13271 1727203833.71000: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203833.71002: Calling groups_plugins_play to load vars for managed-node1 13271 
1727203833.71848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203833.72683: done with get_vars() 13271 1727203833.72696: variable 'ansible_search_path' from source: unknown 13271 1727203833.72697: variable 'ansible_search_path' from source: unknown 13271 1727203833.72723: we have included files to process 13271 1727203833.72724: generating all_blocks data 13271 1727203833.72725: done generating all_blocks data 13271 1727203833.72728: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13271 1727203833.72729: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13271 1727203833.72730: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 13271 1727203833.72852: done processing included file 13271 1727203833.72854: iterating over new_blocks loaded from include file 13271 1727203833.72855: in VariableManager get_vars() 13271 1727203833.72868: done with get_vars() 13271 1727203833.72869: filtering new block on tags 13271 1727203833.72881: done filtering new block on tags 13271 1727203833.72882: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 13271 1727203833.72886: extending task lists for all hosts with included blocks 13271 1727203833.72945: done extending task lists 13271 1727203833.72946: done processing included files 13271 1727203833.72946: results queue empty 13271 1727203833.72947: checking for any_errors_fatal 13271 1727203833.72947: done checking for any_errors_fatal 13271 1727203833.72948: checking for max_fail_percentage 13271 1727203833.72949: done checking for 
max_fail_percentage 13271 1727203833.72949: checking to see if all hosts have failed and the running result is not ok 13271 1727203833.72950: done checking to see if all hosts have failed 13271 1727203833.72950: getting the remaining hosts for this loop 13271 1727203833.72951: done getting the remaining hosts for this loop 13271 1727203833.72952: getting the next task for host managed-node1 13271 1727203833.72955: done getting next task for host managed-node1 13271 1727203833.72956: ^ task is: TASK: Get stat for interface {{ interface }} 13271 1727203833.72958: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203833.72959: getting variables 13271 1727203833.72960: in VariableManager get_vars() 13271 1727203833.72969: Calling all_inventory to load vars for managed-node1 13271 1727203833.72971: Calling groups_inventory to load vars for managed-node1 13271 1727203833.72972: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203833.72977: Calling all_plugins_play to load vars for managed-node1 13271 1727203833.72979: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203833.72982: Calling groups_plugins_play to load vars for managed-node1 13271 1727203833.73609: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203833.74484: done with get_vars() 13271 1727203833.74498: done getting variables 13271 1727203833.74611: variable 'interface' from source: task vars 13271 1727203833.74613: variable 'controller_device' from source: play vars 13271 1727203833.74655: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:50:33 -0400 (0:00:00.048) 0:00:17.390 ***** 13271 1727203833.74682: entering _queue_task() for managed-node1/stat 13271 1727203833.74921: worker is 1 (out of 1 available) 13271 1727203833.74934: exiting _queue_task() for managed-node1/stat 13271 1727203833.74946: done queuing things up, now waiting for results queue to drain 13271 1727203833.74947: waiting for pending results... 
13271 1727203833.75117: running TaskExecutor() for managed-node1/TASK: Get stat for interface nm-bond 13271 1727203833.75203: in run() - task 028d2410-947f-2a40-12ba-000000000241 13271 1727203833.75213: variable 'ansible_search_path' from source: unknown 13271 1727203833.75217: variable 'ansible_search_path' from source: unknown 13271 1727203833.75243: calling self._execute() 13271 1727203833.75315: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203833.75319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203833.75328: variable 'omit' from source: magic vars 13271 1727203833.75614: variable 'ansible_distribution_major_version' from source: facts 13271 1727203833.75617: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203833.75619: variable 'omit' from source: magic vars 13271 1727203833.75632: variable 'omit' from source: magic vars 13271 1727203833.75700: variable 'interface' from source: task vars 13271 1727203833.75704: variable 'controller_device' from source: play vars 13271 1727203833.75750: variable 'controller_device' from source: play vars 13271 1727203833.75766: variable 'omit' from source: magic vars 13271 1727203833.75799: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203833.75831: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203833.75845: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203833.75859: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203833.75871: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203833.75896: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 13271 1727203833.75900: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203833.75902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203833.75969: Set connection var ansible_connection to ssh 13271 1727203833.75977: Set connection var ansible_shell_type to sh 13271 1727203833.75985: Set connection var ansible_timeout to 10 13271 1727203833.75990: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203833.75995: Set connection var ansible_pipelining to False 13271 1727203833.76000: Set connection var ansible_shell_executable to /bin/sh 13271 1727203833.76018: variable 'ansible_shell_executable' from source: unknown 13271 1727203833.76021: variable 'ansible_connection' from source: unknown 13271 1727203833.76023: variable 'ansible_module_compression' from source: unknown 13271 1727203833.76025: variable 'ansible_shell_type' from source: unknown 13271 1727203833.76027: variable 'ansible_shell_executable' from source: unknown 13271 1727203833.76029: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203833.76033: variable 'ansible_pipelining' from source: unknown 13271 1727203833.76038: variable 'ansible_timeout' from source: unknown 13271 1727203833.76040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203833.76185: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13271 1727203833.76193: variable 'omit' from source: magic vars 13271 1727203833.76198: starting attempt loop 13271 1727203833.76202: running the handler 13271 1727203833.76213: _low_level_execute_command(): starting 13271 1727203833.76220: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 
1727203833.76735: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203833.76740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203833.76743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203833.76796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203833.76799: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203833.76801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203833.76894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203833.78692: stdout chunk (state=3): >>>/root <<< 13271 1727203833.78792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203833.78819: stderr chunk (state=3): >>><<< 13271 1727203833.78822: stdout chunk (state=3): >>><<< 13271 1727203833.78843: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203833.78857: _low_level_execute_command(): starting 13271 1727203833.78861: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203833.788436-14544-39849269708593 `" && echo ansible-tmp-1727203833.788436-14544-39849269708593="` echo /root/.ansible/tmp/ansible-tmp-1727203833.788436-14544-39849269708593 `" ) && sleep 0' 13271 1727203833.79294: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203833.79297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 
1727203833.79309: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203833.79311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203833.79352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203833.79355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203833.79443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203833.81535: stdout chunk (state=3): >>>ansible-tmp-1727203833.788436-14544-39849269708593=/root/.ansible/tmp/ansible-tmp-1727203833.788436-14544-39849269708593 <<< 13271 1727203833.81646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203833.81673: stderr chunk (state=3): >>><<< 13271 1727203833.81682: stdout chunk (state=3): >>><<< 13271 1727203833.81696: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203833.788436-14544-39849269708593=/root/.ansible/tmp/ansible-tmp-1727203833.788436-14544-39849269708593 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203833.81732: variable 'ansible_module_compression' from source: unknown 13271 1727203833.81774: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13271 1727203833.81806: variable 'ansible_facts' from source: unknown 13271 1727203833.81867: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203833.788436-14544-39849269708593/AnsiballZ_stat.py 13271 1727203833.81961: Sending initial data 13271 1727203833.81964: Sent initial data (151 bytes) 13271 1727203833.82407: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203833.82411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203833.82413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203833.82415: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203833.82417: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203833.82467: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203833.82473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203833.82553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203833.84312: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 13271 1727203833.84316: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203833.84389: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203833.84465: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmp__9z0v4r /root/.ansible/tmp/ansible-tmp-1727203833.788436-14544-39849269708593/AnsiballZ_stat.py <<< 13271 1727203833.84468: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203833.788436-14544-39849269708593/AnsiballZ_stat.py" <<< 13271 1727203833.84535: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmp__9z0v4r" to remote "/root/.ansible/tmp/ansible-tmp-1727203833.788436-14544-39849269708593/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203833.788436-14544-39849269708593/AnsiballZ_stat.py" <<< 13271 1727203833.85246: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203833.85281: stderr chunk (state=3): >>><<< 13271 1727203833.85283: stdout chunk (state=3): >>><<< 13271 1727203833.85398: done transferring module to remote 13271 1727203833.85403: _low_level_execute_command(): starting 13271 1727203833.85406: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203833.788436-14544-39849269708593/ /root/.ansible/tmp/ansible-tmp-1727203833.788436-14544-39849269708593/AnsiballZ_stat.py && sleep 0' 13271 1727203833.85925: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203833.85940: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203833.85954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203833.85979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203833.86079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203833.86104: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203833.86217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203833.88172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203833.88196: stderr chunk (state=3): >>><<< 13271 1727203833.88199: stdout chunk (state=3): >>><<< 13271 1727203833.88213: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 
10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203833.88216: _low_level_execute_command(): starting 13271 1727203833.88221: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203833.788436-14544-39849269708593/AnsiballZ_stat.py && sleep 0' 13271 1727203833.88791: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203833.88832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203833.88835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203833.88932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203834.05701: 
stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27357, "dev": 23, "nlink": 1, "atime": 1727203832.8594952, "mtime": 1727203832.8594952, "ctime": 1727203832.8594952, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13271 1727203834.07483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203834.07487: stdout chunk (state=3): >>><<< 13271 1727203834.07489: stderr chunk (state=3): >>><<< 13271 1727203834.07492: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27357, "dev": 23, "nlink": 1, "atime": 1727203832.8594952, "mtime": 1727203832.8594952, "ctime": 1727203832.8594952, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 13271 1727203834.07495: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203833.788436-14544-39849269708593/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203834.07498: _low_level_execute_command(): starting 13271 1727203834.07500: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203833.788436-14544-39849269708593/ > /dev/null 2>&1 && sleep 0' 13271 1727203834.08414: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203834.08490: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203834.10981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203834.10985: stdout chunk (state=3): >>><<< 13271 1727203834.10988: stderr chunk (state=3): >>><<< 13271 1727203834.10990: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203834.10992: handler run complete 13271 1727203834.10995: attempt loop complete, returning result 13271 1727203834.10997: _execute() done 13271 1727203834.10998: dumping result to json 13271 1727203834.11000: done dumping result, returning 13271 1727203834.11002: done running TaskExecutor() for managed-node1/TASK: Get stat for interface nm-bond [028d2410-947f-2a40-12ba-000000000241] 13271 1727203834.11004: sending task result for task 028d2410-947f-2a40-12ba-000000000241 ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727203832.8594952, "block_size": 4096, "blocks": 0, "ctime": 1727203832.8594952, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27357, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1727203832.8594952, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 13271 1727203834.11291: no more pending results, returning what we have 13271 1727203834.11295: results queue empty 13271 1727203834.11296: checking for any_errors_fatal 13271 1727203834.11297: done checking for any_errors_fatal 13271 1727203834.11298: checking for max_fail_percentage 13271 1727203834.11299: done checking for max_fail_percentage 13271 1727203834.11300: checking to see if all hosts have failed and the running result is not ok 13271 1727203834.11301: done checking to see if all hosts have failed 13271 1727203834.11302: getting the remaining hosts for 
this loop 13271 1727203834.11303: done getting the remaining hosts for this loop 13271 1727203834.11307: getting the next task for host managed-node1 13271 1727203834.11314: done getting next task for host managed-node1 13271 1727203834.11316: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 13271 1727203834.11319: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203834.11324: getting variables 13271 1727203834.11326: in VariableManager get_vars() 13271 1727203834.11364: Calling all_inventory to load vars for managed-node1 13271 1727203834.11366: Calling groups_inventory to load vars for managed-node1 13271 1727203834.11369: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203834.11493: Calling all_plugins_play to load vars for managed-node1 13271 1727203834.11496: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203834.11501: done sending task result for task 028d2410-947f-2a40-12ba-000000000241 13271 1727203834.11505: WORKER PROCESS EXITING 13271 1727203834.11509: Calling groups_plugins_play to load vars for managed-node1 13271 1727203834.14230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203834.17568: done with get_vars() 13271 1727203834.17748: done getting variables 13271 1727203834.17814: Loading ActionModule 'assert' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
13271 1727203834.18102: variable 'interface' from source: task vars
13271 1727203834.18107: variable 'controller_device' from source: play vars
13271 1727203834.18243: variable 'controller_device' from source: play vars

TASK [Assert that the interface is present - 'nm-bond'] ************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5
Tuesday 24 September 2024  14:50:34 -0400 (0:00:00.437)       0:00:17.827 *****
13271 1727203834.18394: entering _queue_task() for managed-node1/assert
13271 1727203834.19157: worker is 1 (out of 1 available)
13271 1727203834.19168: exiting _queue_task() for managed-node1/assert
13271 1727203834.19182: done queuing things up, now waiting for results queue to drain
13271 1727203834.19184: waiting for pending results...
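The stat-then-assert sequence being executed here (module args visible in the _execute_module call above, and the `interface_stat.stat.exists` conditional evaluated below) corresponds to a task file roughly like the following. This is a hedged reconstruction from the trace, not the verbatim contents of assert_device_present.yml; in particular, the log attributes `interface_stat` to set_fact, so whether the file uses `register` (as sketched) or a separate `set_fact` step is an assumption.

```yaml
# Sketch of tasks/assert_device_present.yml, reconstructed from the trace;
# task names match the log, the rest is inferred.
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: "/sys/class/net/{{ interface }}"   # resolves to /sys/class/net/nm-bond here
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat   # assumption: log shows 'interface_stat' from set_fact

- name: Assert that the interface is present - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - interface_stat.stat.exists
```

In this run the stat result is an `islnk: true` entry pointing at `/sys/devices/virtual/net/nm-bond`, so the assertion passes.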
13271 1727203834.19648: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'nm-bond' 13271 1727203834.19746: in run() - task 028d2410-947f-2a40-12ba-00000000006f 13271 1727203834.19759: variable 'ansible_search_path' from source: unknown 13271 1727203834.19763: variable 'ansible_search_path' from source: unknown 13271 1727203834.20010: calling self._execute() 13271 1727203834.20094: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203834.20100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203834.20116: variable 'omit' from source: magic vars 13271 1727203834.20861: variable 'ansible_distribution_major_version' from source: facts 13271 1727203834.20869: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203834.20877: variable 'omit' from source: magic vars 13271 1727203834.20926: variable 'omit' from source: magic vars 13271 1727203834.21284: variable 'interface' from source: task vars 13271 1727203834.21289: variable 'controller_device' from source: play vars 13271 1727203834.21291: variable 'controller_device' from source: play vars 13271 1727203834.21306: variable 'omit' from source: magic vars 13271 1727203834.21345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203834.21392: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203834.21408: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203834.21426: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203834.21437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203834.21470: variable 'inventory_hostname' from source: 
host vars for 'managed-node1' 13271 1727203834.21474: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203834.21478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203834.21574: Set connection var ansible_connection to ssh 13271 1727203834.21749: Set connection var ansible_shell_type to sh 13271 1727203834.21828: Set connection var ansible_timeout to 10 13271 1727203834.21831: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203834.21834: Set connection var ansible_pipelining to False 13271 1727203834.21837: Set connection var ansible_shell_executable to /bin/sh 13271 1727203834.21839: variable 'ansible_shell_executable' from source: unknown 13271 1727203834.21841: variable 'ansible_connection' from source: unknown 13271 1727203834.21843: variable 'ansible_module_compression' from source: unknown 13271 1727203834.21845: variable 'ansible_shell_type' from source: unknown 13271 1727203834.21847: variable 'ansible_shell_executable' from source: unknown 13271 1727203834.21849: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203834.21851: variable 'ansible_pipelining' from source: unknown 13271 1727203834.21854: variable 'ansible_timeout' from source: unknown 13271 1727203834.21857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203834.21960: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203834.21973: variable 'omit' from source: magic vars 13271 1727203834.22049: starting attempt loop 13271 1727203834.22052: running the handler 13271 1727203834.22379: variable 'interface_stat' from source: set_fact 13271 1727203834.22485: Evaluated conditional 
(interface_stat.stat.exists): True 13271 1727203834.22488: handler run complete 13271 1727203834.22490: attempt loop complete, returning result 13271 1727203834.22492: _execute() done 13271 1727203834.22494: dumping result to json 13271 1727203834.22497: done dumping result, returning 13271 1727203834.22499: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'nm-bond' [028d2410-947f-2a40-12ba-00000000006f] 13271 1727203834.22501: sending task result for task 028d2410-947f-2a40-12ba-00000000006f 13271 1727203834.22565: done sending task result for task 028d2410-947f-2a40-12ba-00000000006f 13271 1727203834.22568: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 13271 1727203834.22631: no more pending results, returning what we have 13271 1727203834.22635: results queue empty 13271 1727203834.22636: checking for any_errors_fatal 13271 1727203834.22643: done checking for any_errors_fatal 13271 1727203834.22644: checking for max_fail_percentage 13271 1727203834.22646: done checking for max_fail_percentage 13271 1727203834.22647: checking to see if all hosts have failed and the running result is not ok 13271 1727203834.22648: done checking to see if all hosts have failed 13271 1727203834.22649: getting the remaining hosts for this loop 13271 1727203834.22650: done getting the remaining hosts for this loop 13271 1727203834.22655: getting the next task for host managed-node1 13271 1727203834.22663: done getting next task for host managed-node1 13271 1727203834.22667: ^ task is: TASK: Include the task 'assert_profile_present.yml' 13271 1727203834.22669: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203834.22673: getting variables
13271 1727203834.22674: in VariableManager get_vars()
13271 1727203834.22713: Calling all_inventory to load vars for managed-node1
13271 1727203834.22716: Calling groups_inventory to load vars for managed-node1
13271 1727203834.22718: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203834.22729: Calling all_plugins_play to load vars for managed-node1
13271 1727203834.22731: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203834.22733: Calling groups_plugins_play to load vars for managed-node1
13271 1727203834.25872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203834.29010: done with get_vars()
13271 1727203834.29151: done getting variables

TASK [Include the task 'assert_profile_present.yml'] ***************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:67
Tuesday 24 September 2024  14:50:34 -0400 (0:00:00.108)       0:00:17.935 *****
13271 1727203834.29337: entering _queue_task() for managed-node1/include_tasks
13271 1727203834.29910: worker is 1 (out of 1 available)
13271 1727203834.29919: exiting _queue_task() for managed-node1/include_tasks
13271 1727203834.29929: done queuing things up, now waiting for results queue to drain
13271 1727203834.29931: waiting for pending results...
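The include task queued here loops over three profile variables; the variable names come from the 'play vars' lines in the trace and the item values from the `=> (item=...)` results that follow (bond0, bond0.0, bond0.1). A hedged sketch of what tests_bond.yml:67 likely looks like — the literal values of the variables and the `when:` placement are assumptions inferred from the repeated `ansible_distribution_major_version != '6'` conditional evaluations:

```yaml
# Sketch of the include at tests_bond.yml:67; loop variable names from the
# trace, item values from the "=> (item=...)" include results below.
- name: Include the task 'assert_profile_present.yml'
  include_tasks: tasks/assert_profile_present.yml
  loop:
    - "{{ controller_profile }}"   # bond0
    - "{{ port1_profile }}"        # bond0.0
    - "{{ port2_profile }}"        # bond0.1
  when: ansible_distribution_major_version != '6'   # assumption: matches the evaluated conditional
```

Each loop iteration produces its own block of included tasks, which is why the trace below shows assert_profile_present.yml processed three times before "extending task lists for all hosts with included blocks".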
13271 1727203834.30172: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_present.yml' 13271 1727203834.30181: in run() - task 028d2410-947f-2a40-12ba-000000000070 13271 1727203834.30185: variable 'ansible_search_path' from source: unknown 13271 1727203834.30230: variable 'controller_profile' from source: play vars 13271 1727203834.30429: variable 'controller_profile' from source: play vars 13271 1727203834.30449: variable 'port1_profile' from source: play vars 13271 1727203834.30572: variable 'port1_profile' from source: play vars 13271 1727203834.30595: variable 'port2_profile' from source: play vars 13271 1727203834.30662: variable 'port2_profile' from source: play vars 13271 1727203834.30701: variable 'omit' from source: magic vars 13271 1727203834.30884: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203834.30887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203834.30900: variable 'omit' from source: magic vars 13271 1727203834.31191: variable 'ansible_distribution_major_version' from source: facts 13271 1727203834.31205: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203834.31247: variable 'item' from source: unknown 13271 1727203834.31356: variable 'item' from source: unknown 13271 1727203834.31582: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203834.31585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203834.31588: variable 'omit' from source: magic vars 13271 1727203834.31708: variable 'ansible_distribution_major_version' from source: facts 13271 1727203834.31719: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203834.31750: variable 'item' from source: unknown 13271 1727203834.31817: variable 'item' from source: unknown 13271 1727203834.32010: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 
1727203834.32015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203834.32018: variable 'omit' from source: magic vars 13271 1727203834.32105: variable 'ansible_distribution_major_version' from source: facts 13271 1727203834.32225: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203834.32229: variable 'item' from source: unknown 13271 1727203834.32232: variable 'item' from source: unknown 13271 1727203834.32279: dumping result to json 13271 1727203834.32282: done dumping result, returning 13271 1727203834.32398: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_present.yml' [028d2410-947f-2a40-12ba-000000000070] 13271 1727203834.32401: sending task result for task 028d2410-947f-2a40-12ba-000000000070 13271 1727203834.32447: done sending task result for task 028d2410-947f-2a40-12ba-000000000070 13271 1727203834.32450: WORKER PROCESS EXITING 13271 1727203834.32531: no more pending results, returning what we have 13271 1727203834.32536: in VariableManager get_vars() 13271 1727203834.32590: Calling all_inventory to load vars for managed-node1 13271 1727203834.32593: Calling groups_inventory to load vars for managed-node1 13271 1727203834.32596: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203834.32612: Calling all_plugins_play to load vars for managed-node1 13271 1727203834.32615: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203834.32619: Calling groups_plugins_play to load vars for managed-node1 13271 1727203834.34706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203834.37406: done with get_vars() 13271 1727203834.37432: variable 'ansible_search_path' from source: unknown 13271 1727203834.37457: variable 'ansible_search_path' from source: unknown 13271 1727203834.37466: variable 'ansible_search_path' from source: unknown 13271 
1727203834.37473: we have included files to process 13271 1727203834.37474: generating all_blocks data 13271 1727203834.37477: done generating all_blocks data 13271 1727203834.37481: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13271 1727203834.37482: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13271 1727203834.37484: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13271 1727203834.37686: in VariableManager get_vars() 13271 1727203834.37711: done with get_vars() 13271 1727203834.37961: done processing included file 13271 1727203834.37962: iterating over new_blocks loaded from include file 13271 1727203834.37964: in VariableManager get_vars() 13271 1727203834.37989: done with get_vars() 13271 1727203834.37991: filtering new block on tags 13271 1727203834.38011: done filtering new block on tags 13271 1727203834.38013: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 => (item=bond0) 13271 1727203834.38019: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13271 1727203834.38020: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13271 1727203834.38023: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13271 1727203834.38120: in VariableManager get_vars() 13271 1727203834.38139: done with get_vars() 13271 1727203834.38341: done 
processing included file 13271 1727203834.38342: iterating over new_blocks loaded from include file 13271 1727203834.38344: in VariableManager get_vars() 13271 1727203834.38358: done with get_vars() 13271 1727203834.38359: filtering new block on tags 13271 1727203834.38374: done filtering new block on tags 13271 1727203834.38378: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 => (item=bond0.0) 13271 1727203834.38381: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13271 1727203834.38382: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13271 1727203834.38385: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 13271 1727203834.38478: in VariableManager get_vars() 13271 1727203834.38564: done with get_vars() 13271 1727203834.38801: done processing included file 13271 1727203834.38803: iterating over new_blocks loaded from include file 13271 1727203834.38804: in VariableManager get_vars() 13271 1727203834.38822: done with get_vars() 13271 1727203834.38824: filtering new block on tags 13271 1727203834.38841: done filtering new block on tags 13271 1727203834.38843: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 => (item=bond0.1) 13271 1727203834.38846: extending task lists for all hosts with included blocks 13271 1727203834.42495: done extending task lists 13271 1727203834.42503: done processing included files 13271 1727203834.42504: results queue empty 13271 
1727203834.42505: checking for any_errors_fatal 13271 1727203834.42508: done checking for any_errors_fatal 13271 1727203834.42509: checking for max_fail_percentage 13271 1727203834.42510: done checking for max_fail_percentage 13271 1727203834.42511: checking to see if all hosts have failed and the running result is not ok 13271 1727203834.42512: done checking to see if all hosts have failed 13271 1727203834.42512: getting the remaining hosts for this loop 13271 1727203834.42513: done getting the remaining hosts for this loop 13271 1727203834.42516: getting the next task for host managed-node1 13271 1727203834.42520: done getting next task for host managed-node1 13271 1727203834.42522: ^ task is: TASK: Include the task 'get_profile_stat.yml' 13271 1727203834.42524: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203834.42526: getting variables
13271 1727203834.42527: in VariableManager get_vars()
13271 1727203834.42547: Calling all_inventory to load vars for managed-node1
13271 1727203834.42550: Calling groups_inventory to load vars for managed-node1
13271 1727203834.42552: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203834.42558: Calling all_plugins_play to load vars for managed-node1
13271 1727203834.42560: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203834.42563: Calling groups_plugins_play to load vars for managed-node1
13271 1727203834.47955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203834.49858: done with get_vars()
13271 1727203834.49990: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3
Tuesday 24 September 2024  14:50:34 -0400 (0:00:00.208)       0:00:18.143 *****
13271 1727203834.50072: entering _queue_task() for managed-node1/include_tasks
13271 1727203834.50579: worker is 1 (out of 1 available)
13271 1727203834.50600: exiting _queue_task() for managed-node1/include_tasks
13271 1727203834.50612: done queuing things up, now waiting for results queue to drain
13271 1727203834.50614: waiting for pending results...
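The task path above places this include at line 3 of assert_profile_present.yml, so that file's opening likely resembles the sketch below. This is a reconstruction from the trace only; how the loop item is passed through to get_profile_stat.yml (shown here as a hypothetical `profile` variable) is an assumption.

```yaml
# Sketch of tasks/assert_profile_present.yml (the include at line 3);
# the 'profile'/'item' wiring is a hypothetical illustration.
- name: Include the task 'get_profile_stat.yml'
  include_tasks: get_profile_stat.yml
  vars:
    profile: "{{ item }}"   # assumption: one of bond0, bond0.0, bond0.1
```

The first task inside the included file then appears in the trace as "Initialize NM profile exist and ansible_managed comment flag".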
13271 1727203834.51270: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 13271 1727203834.51359: in run() - task 028d2410-947f-2a40-12ba-00000000025f 13271 1727203834.51391: variable 'ansible_search_path' from source: unknown 13271 1727203834.51395: variable 'ansible_search_path' from source: unknown 13271 1727203834.51578: calling self._execute() 13271 1727203834.51720: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203834.51727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203834.51736: variable 'omit' from source: magic vars 13271 1727203834.52304: variable 'ansible_distribution_major_version' from source: facts 13271 1727203834.52311: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203834.52318: _execute() done 13271 1727203834.52321: dumping result to json 13271 1727203834.52324: done dumping result, returning 13271 1727203834.52329: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [028d2410-947f-2a40-12ba-00000000025f] 13271 1727203834.52336: sending task result for task 028d2410-947f-2a40-12ba-00000000025f 13271 1727203834.52545: no more pending results, returning what we have 13271 1727203834.52550: in VariableManager get_vars() 13271 1727203834.52601: Calling all_inventory to load vars for managed-node1 13271 1727203834.52604: Calling groups_inventory to load vars for managed-node1 13271 1727203834.52606: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203834.52619: Calling all_plugins_play to load vars for managed-node1 13271 1727203834.52782: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203834.52788: Calling groups_plugins_play to load vars for managed-node1 13271 1727203834.53478: done sending task result for task 028d2410-947f-2a40-12ba-00000000025f 13271 1727203834.53482: WORKER PROCESS EXITING 13271 
1727203834.55685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203834.57741: done with get_vars() 13271 1727203834.57759: variable 'ansible_search_path' from source: unknown 13271 1727203834.57761: variable 'ansible_search_path' from source: unknown 13271 1727203834.57801: we have included files to process 13271 1727203834.57802: generating all_blocks data 13271 1727203834.57804: done generating all_blocks data 13271 1727203834.57806: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13271 1727203834.57807: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13271 1727203834.57814: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13271 1727203834.59233: done processing included file 13271 1727203834.59235: iterating over new_blocks loaded from include file 13271 1727203834.59237: in VariableManager get_vars() 13271 1727203834.59257: done with get_vars() 13271 1727203834.59258: filtering new block on tags 13271 1727203834.59287: done filtering new block on tags 13271 1727203834.59290: in VariableManager get_vars() 13271 1727203834.59309: done with get_vars() 13271 1727203834.59311: filtering new block on tags 13271 1727203834.59333: done filtering new block on tags 13271 1727203834.59336: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 13271 1727203834.59341: extending task lists for all hosts with included blocks 13271 1727203834.59569: done extending task lists 13271 1727203834.59570: done processing included files 13271 1727203834.59571: results queue empty 13271 
1727203834.59572: checking for any_errors_fatal 13271 1727203834.59577: done checking for any_errors_fatal 13271 1727203834.59578: checking for max_fail_percentage 13271 1727203834.59579: done checking for max_fail_percentage 13271 1727203834.59579: checking to see if all hosts have failed and the running result is not ok 13271 1727203834.59580: done checking to see if all hosts have failed 13271 1727203834.59581: getting the remaining hosts for this loop 13271 1727203834.59582: done getting the remaining hosts for this loop 13271 1727203834.59584: getting the next task for host managed-node1 13271 1727203834.59588: done getting next task for host managed-node1 13271 1727203834.59594: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 13271 1727203834.59597: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203834.59599: getting variables 13271 1727203834.59600: in VariableManager get_vars() 13271 1727203834.59613: Calling all_inventory to load vars for managed-node1 13271 1727203834.59615: Calling groups_inventory to load vars for managed-node1 13271 1727203834.59617: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203834.59622: Calling all_plugins_play to load vars for managed-node1 13271 1727203834.59624: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203834.59627: Calling groups_plugins_play to load vars for managed-node1 13271 1727203834.60765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203834.62329: done with get_vars() 13271 1727203834.62351: done getting variables 13271 1727203834.62396: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:50:34 -0400 (0:00:00.123) 0:00:18.267 ***** 13271 1727203834.62431: entering _queue_task() for managed-node1/set_fact 13271 1727203834.62796: worker is 1 (out of 1 available) 13271 1727203834.62810: exiting _queue_task() for managed-node1/set_fact 13271 1727203834.62821: done queuing things up, now waiting for results queue to drain 13271 1727203834.62823: waiting for pending results... 
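Annotation (not part of the captured log): the `set_fact` task queued above initializes three profile flags to `false`, exactly as the task's `ok:` result further down shows under `ansible_facts`. A minimal sketch of that initialization as plain dict construction (the function name is illustrative; Ansible performs this inside the `set_fact` action plugin):

```python
# Sketch of the fact initialization done by the set_fact task above.
# The keys mirror the ansible_facts printed in the task's "ok" result.
def init_profile_flags() -> dict:
    return {
        "lsr_net_profile_exists": False,
        "lsr_net_profile_ansible_managed": False,
        "lsr_net_profile_fingerprint": False,
    }

facts = init_profile_flags()
```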
13271 1727203834.63078: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 13271 1727203834.63167: in run() - task 028d2410-947f-2a40-12ba-0000000003b0 13271 1727203834.63184: variable 'ansible_search_path' from source: unknown 13271 1727203834.63189: variable 'ansible_search_path' from source: unknown 13271 1727203834.63229: calling self._execute() 13271 1727203834.63481: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203834.63485: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203834.63489: variable 'omit' from source: magic vars 13271 1727203834.63728: variable 'ansible_distribution_major_version' from source: facts 13271 1727203834.63738: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203834.63745: variable 'omit' from source: magic vars 13271 1727203834.63797: variable 'omit' from source: magic vars 13271 1727203834.63837: variable 'omit' from source: magic vars 13271 1727203834.63880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203834.63916: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203834.63937: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203834.63955: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203834.63977: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203834.64008: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203834.64011: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203834.64014: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 13271 1727203834.64118: Set connection var ansible_connection to ssh 13271 1727203834.64151: Set connection var ansible_shell_type to sh 13271 1727203834.64154: Set connection var ansible_timeout to 10 13271 1727203834.64156: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203834.64158: Set connection var ansible_pipelining to False 13271 1727203834.64160: Set connection var ansible_shell_executable to /bin/sh 13271 1727203834.64175: variable 'ansible_shell_executable' from source: unknown 13271 1727203834.64262: variable 'ansible_connection' from source: unknown 13271 1727203834.64266: variable 'ansible_module_compression' from source: unknown 13271 1727203834.64268: variable 'ansible_shell_type' from source: unknown 13271 1727203834.64271: variable 'ansible_shell_executable' from source: unknown 13271 1727203834.64273: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203834.64276: variable 'ansible_pipelining' from source: unknown 13271 1727203834.64279: variable 'ansible_timeout' from source: unknown 13271 1727203834.64281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203834.64344: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203834.64354: variable 'omit' from source: magic vars 13271 1727203834.64360: starting attempt loop 13271 1727203834.64371: running the handler 13271 1727203834.64378: handler run complete 13271 1727203834.64389: attempt loop complete, returning result 13271 1727203834.64392: _execute() done 13271 1727203834.64395: dumping result to json 13271 1727203834.64403: done dumping result, returning 13271 1727203834.64409: done running TaskExecutor() for 
managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [028d2410-947f-2a40-12ba-0000000003b0] 13271 1727203834.64480: sending task result for task 028d2410-947f-2a40-12ba-0000000003b0 13271 1727203834.64538: done sending task result for task 028d2410-947f-2a40-12ba-0000000003b0 13271 1727203834.64542: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 13271 1727203834.64629: no more pending results, returning what we have 13271 1727203834.64631: results queue empty 13271 1727203834.64632: checking for any_errors_fatal 13271 1727203834.64634: done checking for any_errors_fatal 13271 1727203834.64635: checking for max_fail_percentage 13271 1727203834.64636: done checking for max_fail_percentage 13271 1727203834.64637: checking to see if all hosts have failed and the running result is not ok 13271 1727203834.64638: done checking to see if all hosts have failed 13271 1727203834.64639: getting the remaining hosts for this loop 13271 1727203834.64640: done getting the remaining hosts for this loop 13271 1727203834.64644: getting the next task for host managed-node1 13271 1727203834.64649: done getting next task for host managed-node1 13271 1727203834.64652: ^ task is: TASK: Stat profile file 13271 1727203834.64656: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203834.64659: getting variables 13271 1727203834.64660: in VariableManager get_vars() 13271 1727203834.64694: Calling all_inventory to load vars for managed-node1 13271 1727203834.64696: Calling groups_inventory to load vars for managed-node1 13271 1727203834.64698: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203834.64707: Calling all_plugins_play to load vars for managed-node1 13271 1727203834.64709: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203834.64712: Calling groups_plugins_play to load vars for managed-node1 13271 1727203834.66142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203834.67907: done with get_vars() 13271 1727203834.67941: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:50:34 -0400 (0:00:00.056) 0:00:18.323 ***** 13271 1727203834.68057: entering _queue_task() for managed-node1/stat 13271 1727203834.68480: worker is 1 (out of 1 available) 13271 1727203834.68492: exiting _queue_task() for managed-node1/stat 13271 1727203834.68504: done queuing things up, now waiting for results queue to drain 13271 1727203834.68506: waiting for pending results... 
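Annotation (not part of the captured log): the "Stat profile file" task queued above transfers `AnsiballZ_stat.py` to the node and runs Ansible's `stat` module against `/etc/sysconfig/network-scripts/ifcfg-bond0`, which returns `{"stat": {"exists": false}}` in the log. A minimal stand-in for that existence check, assuming only the `exists` field matters here (the real module also handles checksums, MIME types, and many more options):

```python
import os

# Simplified sketch of the stat module's core existence check
# (assumption: we only reproduce the "exists"/"changed" fields
# seen in the module result logged for this task).
def stat_exists(path: str) -> dict:
    try:
        os.stat(path)
        return {"changed": False, "stat": {"exists": True}}
    except (FileNotFoundError, NotADirectoryError):
        return {"changed": False, "stat": {"exists": False}}

# On the managed node the ifcfg file is absent, so the module
# reports exists=False and the profile-present assertion uses that.
result = stat_exists("/etc/sysconfig/network-scripts/ifcfg-bond0")
```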
13271 1727203834.68902: running TaskExecutor() for managed-node1/TASK: Stat profile file 13271 1727203834.69078: in run() - task 028d2410-947f-2a40-12ba-0000000003b1 13271 1727203834.69083: variable 'ansible_search_path' from source: unknown 13271 1727203834.69086: variable 'ansible_search_path' from source: unknown 13271 1727203834.69088: calling self._execute() 13271 1727203834.69213: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203834.69227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203834.69255: variable 'omit' from source: magic vars 13271 1727203834.70010: variable 'ansible_distribution_major_version' from source: facts 13271 1727203834.70014: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203834.70017: variable 'omit' from source: magic vars 13271 1727203834.70019: variable 'omit' from source: magic vars 13271 1727203834.70102: variable 'profile' from source: include params 13271 1727203834.70121: variable 'item' from source: include params 13271 1727203834.70196: variable 'item' from source: include params 13271 1727203834.70229: variable 'omit' from source: magic vars 13271 1727203834.70278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203834.70319: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203834.70352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203834.70374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203834.70392: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203834.70423: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 
1727203834.70432: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203834.70446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203834.70544: Set connection var ansible_connection to ssh 13271 1727203834.70584: Set connection var ansible_shell_type to sh 13271 1727203834.70587: Set connection var ansible_timeout to 10 13271 1727203834.70589: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203834.70593: Set connection var ansible_pipelining to False 13271 1727203834.70604: Set connection var ansible_shell_executable to /bin/sh 13271 1727203834.70631: variable 'ansible_shell_executable' from source: unknown 13271 1727203834.70637: variable 'ansible_connection' from source: unknown 13271 1727203834.70661: variable 'ansible_module_compression' from source: unknown 13271 1727203834.70664: variable 'ansible_shell_type' from source: unknown 13271 1727203834.70667: variable 'ansible_shell_executable' from source: unknown 13271 1727203834.70669: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203834.70671: variable 'ansible_pipelining' from source: unknown 13271 1727203834.70673: variable 'ansible_timeout' from source: unknown 13271 1727203834.70692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203834.70909: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13271 1727203834.70914: variable 'omit' from source: magic vars 13271 1727203834.70923: starting attempt loop 13271 1727203834.70980: running the handler 13271 1727203834.70985: _low_level_execute_command(): starting 13271 1727203834.70987: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203834.71770: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203834.71826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203834.71842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203834.71866: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203834.71977: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203834.73788: stdout chunk (state=3): >>>/root <<< 13271 1727203834.73901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203834.73948: stderr chunk (state=3): >>><<< 13271 1727203834.73971: stdout chunk (state=3): >>><<< 13271 1727203834.74007: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203834.74113: _low_level_execute_command(): starting 13271 1727203834.74123: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203834.7401137-14577-22999645124981 `" && echo ansible-tmp-1727203834.7401137-14577-22999645124981="` echo /root/.ansible/tmp/ansible-tmp-1727203834.7401137-14577-22999645124981 `" ) && sleep 0' 13271 1727203834.74713: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203834.74730: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203834.74743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203834.74758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203834.74789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 13271 1727203834.74831: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13271 1727203834.74882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 13271 1727203834.74896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203834.74927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203834.74955: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203834.74989: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203834.75087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203834.77197: stdout chunk (state=3): >>>ansible-tmp-1727203834.7401137-14577-22999645124981=/root/.ansible/tmp/ansible-tmp-1727203834.7401137-14577-22999645124981 <<< 13271 1727203834.77351: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203834.77365: stderr chunk (state=3): >>><<< 13271 1727203834.77377: stdout chunk (state=3): >>><<< 13271 1727203834.77405: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203834.7401137-14577-22999645124981=/root/.ansible/tmp/ansible-tmp-1727203834.7401137-14577-22999645124981 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203834.77580: variable 'ansible_module_compression' from source: unknown 13271 1727203834.77583: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13271 1727203834.77586: variable 'ansible_facts' from source: unknown 13271 1727203834.77653: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203834.7401137-14577-22999645124981/AnsiballZ_stat.py 13271 1727203834.77796: Sending initial data 13271 1727203834.77928: Sent initial data (152 bytes) 13271 1727203834.78484: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203834.78586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203834.78614: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203834.78631: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203834.78656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203834.78773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203834.80508: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203834.80670: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203834.80763: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpvopk6fiw /root/.ansible/tmp/ansible-tmp-1727203834.7401137-14577-22999645124981/AnsiballZ_stat.py <<< 13271 1727203834.80766: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203834.7401137-14577-22999645124981/AnsiballZ_stat.py" <<< 13271 1727203834.80844: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpvopk6fiw" to remote "/root/.ansible/tmp/ansible-tmp-1727203834.7401137-14577-22999645124981/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203834.7401137-14577-22999645124981/AnsiballZ_stat.py" <<< 13271 1727203834.81692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203834.81874: stderr chunk (state=3): >>><<< 13271 1727203834.81880: stdout chunk (state=3): >>><<< 13271 1727203834.81883: done transferring module to remote 13271 1727203834.81886: _low_level_execute_command(): starting 13271 1727203834.81888: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203834.7401137-14577-22999645124981/ /root/.ansible/tmp/ansible-tmp-1727203834.7401137-14577-22999645124981/AnsiballZ_stat.py && sleep 0' 13271 1727203834.82417: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203834.82489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203834.82527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203834.82548: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203834.82583: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203834.82666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203834.84798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203834.84801: stdout chunk (state=3): >>><<< 13271 1727203834.84803: stderr chunk (state=3): >>><<< 13271 1727203834.84806: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203834.84808: _low_level_execute_command(): starting 13271 1727203834.84810: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203834.7401137-14577-22999645124981/AnsiballZ_stat.py && sleep 0' 13271 1727203834.85461: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203834.85526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203834.85541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203834.85579: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203834.85695: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 
1727203835.02419: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13271 1727203835.03893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 13271 1727203835.03897: stdout chunk (state=3): >>><<< 13271 1727203835.03907: stderr chunk (state=3): >>><<< 13271 1727203835.03928: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 13271 1727203835.03962: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203834.7401137-14577-22999645124981/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203835.04014: _low_level_execute_command(): starting 13271 1727203835.04259: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203834.7401137-14577-22999645124981/ > /dev/null 2>&1 && sleep 0' 13271 1727203835.05330: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203835.05350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203835.05613: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203835.05658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203835.05733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203835.07785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203835.07800: stderr chunk (state=3): >>><<< 13271 1727203835.07810: stdout chunk (state=3): >>><<< 13271 1727203835.07849: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 13271 1727203835.07927: handler run complete 13271 1727203835.07958: attempt loop complete, returning result 13271 1727203835.08051: _execute() done 13271 1727203835.08055: dumping result to json 13271 1727203835.08057: done dumping result, returning 13271 1727203835.08059: done running TaskExecutor() for managed-node1/TASK: Stat profile file [028d2410-947f-2a40-12ba-0000000003b1] 13271 1727203835.08061: sending task result for task 028d2410-947f-2a40-12ba-0000000003b1 13271 1727203835.08133: done sending task result for task 028d2410-947f-2a40-12ba-0000000003b1 13271 1727203835.08136: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 13271 1727203835.08219: no more pending results, returning what we have 13271 1727203835.08222: results queue empty 13271 1727203835.08224: checking for any_errors_fatal 13271 1727203835.08230: done checking for any_errors_fatal 13271 1727203835.08231: checking for max_fail_percentage 13271 1727203835.08233: done checking for max_fail_percentage 13271 1727203835.08234: checking to see if all hosts have failed and the running result is not ok 13271 1727203835.08235: done checking to see if all hosts have failed 13271 1727203835.08236: getting the remaining hosts for this loop 13271 1727203835.08237: done getting the remaining hosts for this loop 13271 1727203835.08241: getting the next task for host managed-node1 13271 1727203835.08251: done getting next task for host managed-node1 13271 1727203835.08255: ^ task is: TASK: Set NM profile exist flag based on the profile files 13271 1727203835.08259: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203835.08263: getting variables 13271 1727203835.08265: in VariableManager get_vars() 13271 1727203835.08314: Calling all_inventory to load vars for managed-node1 13271 1727203835.08317: Calling groups_inventory to load vars for managed-node1 13271 1727203835.08320: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203835.08334: Calling all_plugins_play to load vars for managed-node1 13271 1727203835.08337: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203835.08340: Calling groups_plugins_play to load vars for managed-node1 13271 1727203835.10082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203835.12625: done with get_vars() 13271 1727203835.12659: done getting variables 13271 1727203835.12726: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:50:35 -0400 (0:00:00.446) 0:00:18.770 ***** 13271 1727203835.12758: entering _queue_task() for managed-node1/set_fact 13271 1727203835.13515: worker is 1 (out of 1 available) 13271 1727203835.13527: exiting _queue_task() for managed-node1/set_fact 13271 1727203835.13539: done queuing things up, now waiting for results queue to drain 13271 1727203835.13540: waiting for pending results... 13271 1727203835.14106: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 13271 1727203835.14182: in run() - task 028d2410-947f-2a40-12ba-0000000003b2 13271 1727203835.14186: variable 'ansible_search_path' from source: unknown 13271 1727203835.14188: variable 'ansible_search_path' from source: unknown 13271 1727203835.14411: calling self._execute() 13271 1727203835.14499: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203835.14502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203835.14526: variable 'omit' from source: magic vars 13271 1727203835.15287: variable 'ansible_distribution_major_version' from source: facts 13271 1727203835.15381: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203835.15616: variable 'profile_stat' from source: set_fact 13271 1727203835.15632: Evaluated conditional (profile_stat.stat.exists): False 13271 1727203835.15635: when evaluation is False, skipping this task 13271 1727203835.15638: _execute() done 13271 1727203835.15640: dumping result to json 13271 1727203835.15643: done dumping result, returning 13271 1727203835.15652: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [028d2410-947f-2a40-12ba-0000000003b2] 13271 1727203835.15657: sending task result for task 
028d2410-947f-2a40-12ba-0000000003b2 13271 1727203835.15881: done sending task result for task 028d2410-947f-2a40-12ba-0000000003b2 13271 1727203835.15885: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13271 1727203835.15937: no more pending results, returning what we have 13271 1727203835.15941: results queue empty 13271 1727203835.15943: checking for any_errors_fatal 13271 1727203835.15951: done checking for any_errors_fatal 13271 1727203835.15952: checking for max_fail_percentage 13271 1727203835.15954: done checking for max_fail_percentage 13271 1727203835.15955: checking to see if all hosts have failed and the running result is not ok 13271 1727203835.15956: done checking to see if all hosts have failed 13271 1727203835.15957: getting the remaining hosts for this loop 13271 1727203835.15958: done getting the remaining hosts for this loop 13271 1727203835.15962: getting the next task for host managed-node1 13271 1727203835.15970: done getting next task for host managed-node1 13271 1727203835.15973: ^ task is: TASK: Get NM profile info 13271 1727203835.15980: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13271 1727203835.15984: getting variables 13271 1727203835.15986: in VariableManager get_vars() 13271 1727203835.16027: Calling all_inventory to load vars for managed-node1 13271 1727203835.16029: Calling groups_inventory to load vars for managed-node1 13271 1727203835.16031: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203835.16045: Calling all_plugins_play to load vars for managed-node1 13271 1727203835.16047: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203835.16050: Calling groups_plugins_play to load vars for managed-node1 13271 1727203835.19222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203835.22553: done with get_vars() 13271 1727203835.22709: done getting variables 13271 1727203835.22773: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:50:35 -0400 (0:00:00.101) 0:00:18.872 ***** 13271 1727203835.22887: entering _queue_task() for managed-node1/shell 13271 1727203835.23754: worker is 1 (out of 1 available) 13271 1727203835.23763: exiting _queue_task() for managed-node1/shell 13271 1727203835.23774: done queuing things up, now waiting for results queue to drain 13271 1727203835.23777: waiting for pending results... 
13271 1727203835.24121: running TaskExecutor() for managed-node1/TASK: Get NM profile info 13271 1727203835.24418: in run() - task 028d2410-947f-2a40-12ba-0000000003b3 13271 1727203835.24422: variable 'ansible_search_path' from source: unknown 13271 1727203835.24424: variable 'ansible_search_path' from source: unknown 13271 1727203835.24510: calling self._execute() 13271 1727203835.24742: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203835.24746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203835.24756: variable 'omit' from source: magic vars 13271 1727203835.25608: variable 'ansible_distribution_major_version' from source: facts 13271 1727203835.25612: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203835.25614: variable 'omit' from source: magic vars 13271 1727203835.25705: variable 'omit' from source: magic vars 13271 1727203835.25929: variable 'profile' from source: include params 13271 1727203835.25940: variable 'item' from source: include params 13271 1727203835.26108: variable 'item' from source: include params 13271 1727203835.26120: variable 'omit' from source: magic vars 13271 1727203835.26434: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203835.26437: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203835.26440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203835.26442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203835.26444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203835.26446: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 
1727203835.26448: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203835.26450: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203835.26709: Set connection var ansible_connection to ssh 13271 1727203835.26721: Set connection var ansible_shell_type to sh 13271 1727203835.26773: Set connection var ansible_timeout to 10 13271 1727203835.26786: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203835.26795: Set connection var ansible_pipelining to False 13271 1727203835.26981: Set connection var ansible_shell_executable to /bin/sh 13271 1727203835.26985: variable 'ansible_shell_executable' from source: unknown 13271 1727203835.26987: variable 'ansible_connection' from source: unknown 13271 1727203835.26989: variable 'ansible_module_compression' from source: unknown 13271 1727203835.26991: variable 'ansible_shell_type' from source: unknown 13271 1727203835.26993: variable 'ansible_shell_executable' from source: unknown 13271 1727203835.26995: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203835.26997: variable 'ansible_pipelining' from source: unknown 13271 1727203835.26999: variable 'ansible_timeout' from source: unknown 13271 1727203835.27001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203835.27281: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203835.27284: variable 'omit' from source: magic vars 13271 1727203835.27286: starting attempt loop 13271 1727203835.27288: running the handler 13271 1727203835.27297: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203835.27325: _low_level_execute_command(): starting 13271 1727203835.27482: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203835.29122: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203835.29135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203835.29138: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203835.29505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203835.29681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203835.31370: stdout chunk (state=3): >>>/root <<< 13271 1727203835.31524: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203835.31527: stdout chunk (state=3): >>><<< 13271 1727203835.31537: stderr chunk (state=3): 
>>><<< 13271 1727203835.31564: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203835.31580: _low_level_execute_command(): starting 13271 1727203835.31587: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203835.3156254-14596-83270680902896 `" && echo ansible-tmp-1727203835.3156254-14596-83270680902896="` echo /root/.ansible/tmp/ansible-tmp-1727203835.3156254-14596-83270680902896 `" ) && sleep 0' 13271 1727203835.32880: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203835.32893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203835.33000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203835.33007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13271 1727203835.33039: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203835.33045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 13271 1727203835.33094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203835.33228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203835.33271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203835.33396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203835.35526: stdout chunk (state=3): >>>ansible-tmp-1727203835.3156254-14596-83270680902896=/root/.ansible/tmp/ansible-tmp-1727203835.3156254-14596-83270680902896 <<< 13271 1727203835.35629: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203835.35680: stderr chunk (state=3): >>><<< 13271 1727203835.35685: stdout chunk (state=3): >>><<< 13271 1727203835.35711: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203835.3156254-14596-83270680902896=/root/.ansible/tmp/ansible-tmp-1727203835.3156254-14596-83270680902896 , stderr=OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203835.35743: variable 'ansible_module_compression' from source: unknown 13271 1727203835.35894: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13271 1727203835.36046: variable 'ansible_facts' from source: unknown 13271 1727203835.36194: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203835.3156254-14596-83270680902896/AnsiballZ_command.py 13271 1727203835.36691: Sending initial data 13271 1727203835.36694: Sent initial data (155 bytes) 13271 1727203835.37841: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203835.37853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203835.37890: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203835.38054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203835.38081: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203835.38084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203835.38272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203835.40060: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203835.40137: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13271 1727203835.40206: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpgnzs38tg /root/.ansible/tmp/ansible-tmp-1727203835.3156254-14596-83270680902896/AnsiballZ_command.py <<< 13271 1727203835.40210: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203835.3156254-14596-83270680902896/AnsiballZ_command.py" <<< 13271 1727203835.40347: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpgnzs38tg" to remote "/root/.ansible/tmp/ansible-tmp-1727203835.3156254-14596-83270680902896/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203835.3156254-14596-83270680902896/AnsiballZ_command.py" <<< 13271 1727203835.41840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203835.41844: stdout chunk (state=3): >>><<< 13271 1727203835.41846: stderr chunk (state=3): >>><<< 13271 1727203835.41904: done transferring module to remote 13271 1727203835.41915: _low_level_execute_command(): starting 13271 1727203835.41920: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203835.3156254-14596-83270680902896/ /root/.ansible/tmp/ansible-tmp-1727203835.3156254-14596-83270680902896/AnsiballZ_command.py && sleep 0' 13271 1727203835.43405: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203835.43414: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203835.43582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203835.43691: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203835.43802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203835.45789: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203835.45990: stderr chunk (state=3): >>><<< 13271 1727203835.45993: stdout chunk (state=3): >>><<< 13271 1727203835.46011: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203835.46014: _low_level_execute_command(): starting 13271 1727203835.46019: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203835.3156254-14596-83270680902896/AnsiballZ_command.py && sleep 0' 13271 1727203835.47234: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203835.47244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203835.47253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203835.47268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203835.47288: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203835.47482: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203835.47598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203835.47780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203835.71180: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:50:35.640558", "end": "2024-09-24 14:50:35.709941", "delta": "0:00:00.069383", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13271 1727203835.73089: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203835.73093: stdout chunk (state=3): >>><<< 13271 1727203835.73096: stderr chunk (state=3): >>><<< 13271 1727203835.73098: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:50:35.640558", "end": "2024-09-24 14:50:35.709941", "delta": "0:00:00.069383", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 13271 1727203835.73101: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203835.3156254-14596-83270680902896/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203835.73109: _low_level_execute_command(): starting 13271 1727203835.73111: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203835.3156254-14596-83270680902896/ > /dev/null 2>&1 && sleep 0' 13271 1727203835.73680: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203835.73687: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203835.73699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203835.73715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203835.73728: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203835.73741: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203835.73744: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203835.73759: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 1727203835.73766: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 13271 1727203835.73774: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13271 1727203835.73792: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203835.73850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203835.73853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203835.73856: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203835.73858: stderr chunk (state=3): >>>debug2: match found <<< 13271 1727203835.73860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203835.73907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203835.73920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203835.73947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203835.74090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203835.76065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203835.76293: stderr chunk (state=3): >>><<< 13271 1727203835.76296: stdout chunk (state=3): >>><<< 13271 1727203835.76298: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203835.76300: handler run complete 13271 1727203835.76302: Evaluated conditional (False): False 13271 1727203835.76303: attempt loop complete, returning result 13271 1727203835.76305: _execute() done 13271 1727203835.76307: dumping result to json 13271 1727203835.76308: done dumping result, returning 13271 1727203835.76310: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [028d2410-947f-2a40-12ba-0000000003b3] 13271 1727203835.76312: sending task result for task 028d2410-947f-2a40-12ba-0000000003b3 13271 1727203835.76373: done sending task result for task 028d2410-947f-2a40-12ba-0000000003b3 13271 1727203835.76377: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.069383", "end": "2024-09-24 14:50:35.709941", "rc": 0, "start": "2024-09-24 14:50:35.640558" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 
/etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 13271 1727203835.76656: no more pending results, returning what we have 13271 1727203835.76660: results queue empty 13271 1727203835.76661: checking for any_errors_fatal 13271 1727203835.76665: done checking for any_errors_fatal 13271 1727203835.76666: checking for max_fail_percentage 13271 1727203835.76668: done checking for max_fail_percentage 13271 1727203835.76669: checking to see if all hosts have failed and the running result is not ok 13271 1727203835.76670: done checking to see if all hosts have failed 13271 1727203835.76670: getting the remaining hosts for this loop 13271 1727203835.76672: done getting the remaining hosts for this loop 13271 1727203835.76677: getting the next task for host managed-node1 13271 1727203835.76685: done getting next task for host managed-node1 13271 1727203835.76688: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13271 1727203835.76692: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203835.76696: getting variables 13271 1727203835.76698: in VariableManager get_vars() 13271 1727203835.76735: Calling all_inventory to load vars for managed-node1 13271 1727203835.76738: Calling groups_inventory to load vars for managed-node1 13271 1727203835.76741: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203835.76751: Calling all_plugins_play to load vars for managed-node1 13271 1727203835.76754: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203835.76757: Calling groups_plugins_play to load vars for managed-node1 13271 1727203835.78166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203835.79914: done with get_vars() 13271 1727203835.79942: done getting variables 13271 1727203835.80005: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:50:35 -0400 (0:00:00.571) 0:00:19.443 ***** 13271 1727203835.80041: entering _queue_task() for managed-node1/set_fact 13271 1727203835.80494: worker is 1 (out of 1 available) 13271 1727203835.80503: exiting _queue_task() for managed-node1/set_fact 13271 1727203835.80514: done queuing things up, now waiting for results queue to drain 13271 1727203835.80515: waiting for pending results... 
13271 1727203835.80704: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13271 1727203835.80819: in run() - task 028d2410-947f-2a40-12ba-0000000003b4 13271 1727203835.80833: variable 'ansible_search_path' from source: unknown 13271 1727203835.80838: variable 'ansible_search_path' from source: unknown 13271 1727203835.80870: calling self._execute() 13271 1727203835.80967: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203835.80970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203835.80981: variable 'omit' from source: magic vars 13271 1727203835.81367: variable 'ansible_distribution_major_version' from source: facts 13271 1727203835.81379: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203835.81514: variable 'nm_profile_exists' from source: set_fact 13271 1727203835.81526: Evaluated conditional (nm_profile_exists.rc == 0): True 13271 1727203835.81532: variable 'omit' from source: magic vars 13271 1727203835.81623: variable 'omit' from source: magic vars 13271 1727203835.81627: variable 'omit' from source: magic vars 13271 1727203835.81655: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203835.81698: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203835.81718: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203835.81735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203835.81840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203835.81843: variable 'inventory_hostname' from source: host vars for 'managed-node1' 
13271 1727203835.81846: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203835.81848: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203835.81887: Set connection var ansible_connection to ssh 13271 1727203835.81895: Set connection var ansible_shell_type to sh 13271 1727203835.81908: Set connection var ansible_timeout to 10 13271 1727203835.81913: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203835.81919: Set connection var ansible_pipelining to False 13271 1727203835.81924: Set connection var ansible_shell_executable to /bin/sh 13271 1727203835.81948: variable 'ansible_shell_executable' from source: unknown 13271 1727203835.81951: variable 'ansible_connection' from source: unknown 13271 1727203835.81953: variable 'ansible_module_compression' from source: unknown 13271 1727203835.81955: variable 'ansible_shell_type' from source: unknown 13271 1727203835.81958: variable 'ansible_shell_executable' from source: unknown 13271 1727203835.81960: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203835.81965: variable 'ansible_pipelining' from source: unknown 13271 1727203835.81967: variable 'ansible_timeout' from source: unknown 13271 1727203835.81970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203835.82108: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203835.82125: variable 'omit' from source: magic vars 13271 1727203835.82130: starting attempt loop 13271 1727203835.82133: running the handler 13271 1727203835.82145: handler run complete 13271 1727203835.82167: attempt loop complete, returning result 13271 1727203835.82170: _execute() done 
13271 1727203835.82173: dumping result to json 13271 1727203835.82176: done dumping result, returning 13271 1727203835.82179: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [028d2410-947f-2a40-12ba-0000000003b4] 13271 1727203835.82181: sending task result for task 028d2410-947f-2a40-12ba-0000000003b4 13271 1727203835.82330: done sending task result for task 028d2410-947f-2a40-12ba-0000000003b4 13271 1727203835.82334: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 13271 1727203835.82390: no more pending results, returning what we have 13271 1727203835.82394: results queue empty 13271 1727203835.82395: checking for any_errors_fatal 13271 1727203835.82404: done checking for any_errors_fatal 13271 1727203835.82405: checking for max_fail_percentage 13271 1727203835.82406: done checking for max_fail_percentage 13271 1727203835.82407: checking to see if all hosts have failed and the running result is not ok 13271 1727203835.82408: done checking to see if all hosts have failed 13271 1727203835.82409: getting the remaining hosts for this loop 13271 1727203835.82411: done getting the remaining hosts for this loop 13271 1727203835.82414: getting the next task for host managed-node1 13271 1727203835.82425: done getting next task for host managed-node1 13271 1727203835.82427: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 13271 1727203835.82431: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203835.82437: getting variables 13271 1727203835.82439: in VariableManager get_vars() 13271 1727203835.82478: Calling all_inventory to load vars for managed-node1 13271 1727203835.82481: Calling groups_inventory to load vars for managed-node1 13271 1727203835.82484: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203835.82495: Calling all_plugins_play to load vars for managed-node1 13271 1727203835.82498: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203835.82501: Calling groups_plugins_play to load vars for managed-node1 13271 1727203835.83923: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203835.85579: done with get_vars() 13271 1727203835.85607: done getting variables 13271 1727203835.85664: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203835.85789: variable 'profile' from source: include params 13271 1727203835.85793: variable 'item' from source: include params 13271 1727203835.85863: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:50:35 -0400 (0:00:00.058) 0:00:19.502 ***** 13271 1727203835.85905: entering _queue_task() for managed-node1/command 13271 1727203835.86234: worker is 1 (out of 1 available) 13271 1727203835.86247: exiting _queue_task() for managed-node1/command 13271 1727203835.86373: done queuing things up, now waiting for results queue to drain 13271 1727203835.86377: waiting for pending results... 13271 1727203835.86548: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0 13271 1727203835.86684: in run() - task 028d2410-947f-2a40-12ba-0000000003b6 13271 1727203835.86689: variable 'ansible_search_path' from source: unknown 13271 1727203835.86691: variable 'ansible_search_path' from source: unknown 13271 1727203835.86694: calling self._execute() 13271 1727203835.86788: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203835.86981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203835.86986: variable 'omit' from source: magic vars 13271 1727203835.87191: variable 'ansible_distribution_major_version' from source: facts 13271 1727203835.87202: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203835.87323: variable 'profile_stat' from source: set_fact 13271 1727203835.87338: Evaluated conditional (profile_stat.stat.exists): False 13271 1727203835.87341: when evaluation is False, skipping this task 13271 1727203835.87344: _execute() done 13271 1727203835.87346: dumping result to json 13271 1727203835.87357: done dumping result, returning 13271 1727203835.87366: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0 [028d2410-947f-2a40-12ba-0000000003b6] 13271 1727203835.87368: sending task result for task 028d2410-947f-2a40-12ba-0000000003b6 13271 
1727203835.87450: done sending task result for task 028d2410-947f-2a40-12ba-0000000003b6 13271 1727203835.87453: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13271 1727203835.87513: no more pending results, returning what we have 13271 1727203835.87517: results queue empty 13271 1727203835.87518: checking for any_errors_fatal 13271 1727203835.87524: done checking for any_errors_fatal 13271 1727203835.87525: checking for max_fail_percentage 13271 1727203835.87527: done checking for max_fail_percentage 13271 1727203835.87528: checking to see if all hosts have failed and the running result is not ok 13271 1727203835.87529: done checking to see if all hosts have failed 13271 1727203835.87530: getting the remaining hosts for this loop 13271 1727203835.87531: done getting the remaining hosts for this loop 13271 1727203835.87535: getting the next task for host managed-node1 13271 1727203835.87544: done getting next task for host managed-node1 13271 1727203835.87546: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13271 1727203835.87550: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13271 1727203835.87554: getting variables 13271 1727203835.87556: in VariableManager get_vars() 13271 1727203835.87812: Calling all_inventory to load vars for managed-node1 13271 1727203835.87815: Calling groups_inventory to load vars for managed-node1 13271 1727203835.87818: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203835.87829: Calling all_plugins_play to load vars for managed-node1 13271 1727203835.87832: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203835.87835: Calling groups_plugins_play to load vars for managed-node1 13271 1727203835.89352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203835.90890: done with get_vars() 13271 1727203835.90920: done getting variables 13271 1727203835.90986: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203835.91107: variable 'profile' from source: include params 13271 1727203835.91111: variable 'item' from source: include params 13271 1727203835.91177: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:50:35 -0400 (0:00:00.053) 0:00:19.555 ***** 13271 1727203835.91210: entering _queue_task() for managed-node1/set_fact 13271 1727203835.91557: worker is 1 (out of 1 available) 13271 1727203835.91683: exiting _queue_task() for managed-node1/set_fact 13271 1727203835.91696: done queuing things up, now waiting for results queue 
to drain 13271 1727203835.91698: waiting for pending results... 13271 1727203835.92094: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 13271 1727203835.92098: in run() - task 028d2410-947f-2a40-12ba-0000000003b7 13271 1727203835.92101: variable 'ansible_search_path' from source: unknown 13271 1727203835.92104: variable 'ansible_search_path' from source: unknown 13271 1727203835.92106: calling self._execute() 13271 1727203835.92143: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203835.92149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203835.92158: variable 'omit' from source: magic vars 13271 1727203835.92541: variable 'ansible_distribution_major_version' from source: facts 13271 1727203835.92545: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203835.92649: variable 'profile_stat' from source: set_fact 13271 1727203835.92667: Evaluated conditional (profile_stat.stat.exists): False 13271 1727203835.92671: when evaluation is False, skipping this task 13271 1727203835.92673: _execute() done 13271 1727203835.92683: dumping result to json 13271 1727203835.92686: done dumping result, returning 13271 1727203835.92693: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 [028d2410-947f-2a40-12ba-0000000003b7] 13271 1727203835.92699: sending task result for task 028d2410-947f-2a40-12ba-0000000003b7 13271 1727203835.92783: done sending task result for task 028d2410-947f-2a40-12ba-0000000003b7 13271 1727203835.92788: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13271 1727203835.92837: no more pending results, returning what we have 13271 1727203835.92842: results queue empty 13271 1727203835.92843: checking for any_errors_fatal 13271 1727203835.92849: 
done checking for any_errors_fatal 13271 1727203835.92850: checking for max_fail_percentage 13271 1727203835.92852: done checking for max_fail_percentage 13271 1727203835.92852: checking to see if all hosts have failed and the running result is not ok 13271 1727203835.92854: done checking to see if all hosts have failed 13271 1727203835.92854: getting the remaining hosts for this loop 13271 1727203835.92856: done getting the remaining hosts for this loop 13271 1727203835.92859: getting the next task for host managed-node1 13271 1727203835.92867: done getting next task for host managed-node1 13271 1727203835.92870: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13271 1727203835.92986: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203835.92991: getting variables 13271 1727203835.92992: in VariableManager get_vars() 13271 1727203835.93035: Calling all_inventory to load vars for managed-node1 13271 1727203835.93038: Calling groups_inventory to load vars for managed-node1 13271 1727203835.93040: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203835.93053: Calling all_plugins_play to load vars for managed-node1 13271 1727203835.93057: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203835.93060: Calling groups_plugins_play to load vars for managed-node1 13271 1727203835.94493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203835.96173: done with get_vars() 13271 1727203835.96201: done getting variables 13271 1727203835.96266: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203835.96390: variable 'profile' from source: include params 13271 1727203835.96394: variable 'item' from source: include params 13271 1727203835.96458: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:50:35 -0400 (0:00:00.052) 0:00:19.608 ***** 13271 1727203835.96491: entering _queue_task() for managed-node1/command 13271 1727203835.97101: worker is 1 (out of 1 available) 13271 1727203835.97114: exiting _queue_task() for managed-node1/command 13271 1727203835.97127: done queuing things up, now waiting for results queue to drain 13271 1727203835.97128: waiting for pending results... 
13271 1727203835.97894: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0 13271 1727203835.97931: in run() - task 028d2410-947f-2a40-12ba-0000000003b8 13271 1727203835.97961: variable 'ansible_search_path' from source: unknown 13271 1727203835.97967: variable 'ansible_search_path' from source: unknown 13271 1727203835.98208: calling self._execute() 13271 1727203835.98421: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203835.98432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203835.98436: variable 'omit' from source: magic vars 13271 1727203835.98793: variable 'ansible_distribution_major_version' from source: facts 13271 1727203835.98808: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203835.98938: variable 'profile_stat' from source: set_fact 13271 1727203835.98956: Evaluated conditional (profile_stat.stat.exists): False 13271 1727203835.98970: when evaluation is False, skipping this task 13271 1727203835.98982: _execute() done 13271 1727203835.98990: dumping result to json 13271 1727203835.98996: done dumping result, returning 13271 1727203835.99080: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0 [028d2410-947f-2a40-12ba-0000000003b8] 13271 1727203835.99084: sending task result for task 028d2410-947f-2a40-12ba-0000000003b8 13271 1727203835.99146: done sending task result for task 028d2410-947f-2a40-12ba-0000000003b8 13271 1727203835.99149: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13271 1727203835.99209: no more pending results, returning what we have 13271 1727203835.99213: results queue empty 13271 1727203835.99214: checking for any_errors_fatal 13271 1727203835.99220: done checking for any_errors_fatal 13271 1727203835.99221: checking for 
max_fail_percentage 13271 1727203835.99223: done checking for max_fail_percentage 13271 1727203835.99223: checking to see if all hosts have failed and the running result is not ok 13271 1727203835.99224: done checking to see if all hosts have failed 13271 1727203835.99225: getting the remaining hosts for this loop 13271 1727203835.99226: done getting the remaining hosts for this loop 13271 1727203835.99230: getting the next task for host managed-node1 13271 1727203835.99237: done getting next task for host managed-node1 13271 1727203835.99239: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13271 1727203835.99244: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203835.99249: getting variables 13271 1727203835.99250: in VariableManager get_vars() 13271 1727203835.99298: Calling all_inventory to load vars for managed-node1 13271 1727203835.99301: Calling groups_inventory to load vars for managed-node1 13271 1727203835.99303: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203835.99317: Calling all_plugins_play to load vars for managed-node1 13271 1727203835.99320: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203835.99323: Calling groups_plugins_play to load vars for managed-node1 13271 1727203836.02778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203836.06047: done with get_vars() 13271 1727203836.06082: done getting variables 13271 1727203836.06143: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203836.06458: variable 'profile' from source: include params 13271 1727203836.06465: variable 'item' from source: include params 13271 1727203836.06524: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:50:36 -0400 (0:00:00.100) 0:00:19.708 ***** 13271 1727203836.06556: entering _queue_task() for managed-node1/set_fact 13271 1727203836.07217: worker is 1 (out of 1 available) 13271 1727203836.07230: exiting _queue_task() for managed-node1/set_fact 13271 1727203836.07241: done queuing things up, now waiting for results queue to drain 13271 1727203836.07243: waiting for pending results... 
13271 1727203836.07980: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0 13271 1727203836.07987: in run() - task 028d2410-947f-2a40-12ba-0000000003b9 13271 1727203836.07991: variable 'ansible_search_path' from source: unknown 13271 1727203836.07994: variable 'ansible_search_path' from source: unknown 13271 1727203836.08108: calling self._execute() 13271 1727203836.08226: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203836.08346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203836.08358: variable 'omit' from source: magic vars 13271 1727203836.09379: variable 'ansible_distribution_major_version' from source: facts 13271 1727203836.09392: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203836.09844: variable 'profile_stat' from source: set_fact 13271 1727203836.09859: Evaluated conditional (profile_stat.stat.exists): False 13271 1727203836.09865: when evaluation is False, skipping this task 13271 1727203836.09868: _execute() done 13271 1727203836.09870: dumping result to json 13271 1727203836.09873: done dumping result, returning 13271 1727203836.09877: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0 [028d2410-947f-2a40-12ba-0000000003b9] 13271 1727203836.09882: sending task result for task 028d2410-947f-2a40-12ba-0000000003b9 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13271 1727203836.10078: no more pending results, returning what we have 13271 1727203836.10083: results queue empty 13271 1727203836.10084: checking for any_errors_fatal 13271 1727203836.10090: done checking for any_errors_fatal 13271 1727203836.10091: checking for max_fail_percentage 13271 1727203836.10093: done checking for max_fail_percentage 13271 1727203836.10093: checking to see if all hosts have 
failed and the running result is not ok 13271 1727203836.10095: done checking to see if all hosts have failed 13271 1727203836.10095: getting the remaining hosts for this loop 13271 1727203836.10097: done getting the remaining hosts for this loop 13271 1727203836.10101: getting the next task for host managed-node1 13271 1727203836.10110: done getting next task for host managed-node1 13271 1727203836.10113: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13271 1727203836.10117: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203836.10122: getting variables 13271 1727203836.10124: in VariableManager get_vars() 13271 1727203836.10170: Calling all_inventory to load vars for managed-node1 13271 1727203836.10172: Calling groups_inventory to load vars for managed-node1 13271 1727203836.10177: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203836.10190: Calling all_plugins_play to load vars for managed-node1 13271 1727203836.10192: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203836.10195: Calling groups_plugins_play to load vars for managed-node1 13271 1727203836.10882: done sending task result for task 028d2410-947f-2a40-12ba-0000000003b9 13271 1727203836.10885: WORKER PROCESS EXITING 13271 1727203836.12934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203836.16856: done with get_vars() 13271 1727203836.16995: done getting variables 13271 1727203836.17066: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203836.17449: variable 'profile' from source: include params 13271 1727203836.17453: variable 'item' from source: include params 13271 1727203836.17519: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:50:36 -0400 (0:00:00.110) 0:00:19.819 ***** 13271 1727203836.17604: entering _queue_task() for managed-node1/assert 13271 1727203836.18289: worker is 1 (out of 1 available) 13271 1727203836.18304: exiting _queue_task() for managed-node1/assert 13271 
1727203836.18317: done queuing things up, now waiting for results queue to drain 13271 1727203836.18319: waiting for pending results... 13271 1727203836.18767: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0' 13271 1727203836.19037: in run() - task 028d2410-947f-2a40-12ba-000000000260 13271 1727203836.19152: variable 'ansible_search_path' from source: unknown 13271 1727203836.19479: variable 'ansible_search_path' from source: unknown 13271 1727203836.19485: calling self._execute() 13271 1727203836.19488: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203836.19491: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203836.19494: variable 'omit' from source: magic vars 13271 1727203836.20184: variable 'ansible_distribution_major_version' from source: facts 13271 1727203836.20195: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203836.20202: variable 'omit' from source: magic vars 13271 1727203836.20238: variable 'omit' from source: magic vars 13271 1727203836.20444: variable 'profile' from source: include params 13271 1727203836.20448: variable 'item' from source: include params 13271 1727203836.20628: variable 'item' from source: include params 13271 1727203836.20646: variable 'omit' from source: magic vars 13271 1727203836.20769: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203836.20895: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203836.20921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203836.20940: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203836.20952: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203836.21084: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203836.21089: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203836.21092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203836.21210: Set connection var ansible_connection to ssh 13271 1727203836.21217: Set connection var ansible_shell_type to sh 13271 1727203836.21225: Set connection var ansible_timeout to 10 13271 1727203836.21355: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203836.21365: Set connection var ansible_pipelining to False 13271 1727203836.21368: Set connection var ansible_shell_executable to /bin/sh 13271 1727203836.21560: variable 'ansible_shell_executable' from source: unknown 13271 1727203836.21566: variable 'ansible_connection' from source: unknown 13271 1727203836.21568: variable 'ansible_module_compression' from source: unknown 13271 1727203836.21571: variable 'ansible_shell_type' from source: unknown 13271 1727203836.21573: variable 'ansible_shell_executable' from source: unknown 13271 1727203836.21577: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203836.21579: variable 'ansible_pipelining' from source: unknown 13271 1727203836.21582: variable 'ansible_timeout' from source: unknown 13271 1727203836.21584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203836.21742: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203836.21752: variable 'omit' from source: magic vars 13271 1727203836.21758: starting 
attempt loop 13271 1727203836.21760: running the handler 13271 1727203836.22064: variable 'lsr_net_profile_exists' from source: set_fact 13271 1727203836.22068: Evaluated conditional (lsr_net_profile_exists): True 13271 1727203836.22073: handler run complete 13271 1727203836.22091: attempt loop complete, returning result 13271 1727203836.22099: _execute() done 13271 1727203836.22102: dumping result to json 13271 1727203836.22105: done dumping result, returning 13271 1727203836.22107: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0' [028d2410-947f-2a40-12ba-000000000260] 13271 1727203836.22109: sending task result for task 028d2410-947f-2a40-12ba-000000000260 13271 1727203836.22199: done sending task result for task 028d2410-947f-2a40-12ba-000000000260 13271 1727203836.22204: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 13271 1727203836.22252: no more pending results, returning what we have 13271 1727203836.22256: results queue empty 13271 1727203836.22258: checking for any_errors_fatal 13271 1727203836.22267: done checking for any_errors_fatal 13271 1727203836.22268: checking for max_fail_percentage 13271 1727203836.22270: done checking for max_fail_percentage 13271 1727203836.22271: checking to see if all hosts have failed and the running result is not ok 13271 1727203836.22273: done checking to see if all hosts have failed 13271 1727203836.22273: getting the remaining hosts for this loop 13271 1727203836.22275: done getting the remaining hosts for this loop 13271 1727203836.22280: getting the next task for host managed-node1 13271 1727203836.22288: done getting next task for host managed-node1 13271 1727203836.22291: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13271 1727203836.22294: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203836.22299: getting variables 13271 1727203836.22300: in VariableManager get_vars() 13271 1727203836.22339: Calling all_inventory to load vars for managed-node1 13271 1727203836.22341: Calling groups_inventory to load vars for managed-node1 13271 1727203836.22343: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203836.22354: Calling all_plugins_play to load vars for managed-node1 13271 1727203836.22357: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203836.22359: Calling groups_plugins_play to load vars for managed-node1 13271 1727203836.25107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203836.26797: done with get_vars() 13271 1727203836.26821: done getting variables 13271 1727203836.26894: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203836.27027: variable 'profile' from source: include params 13271 1727203836.27034: variable 'item' from source: include params 13271 1727203836.27095: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:50:36 -0400 (0:00:00.095) 0:00:19.914 ***** 13271 1727203836.27169: entering _queue_task() for managed-node1/assert 13271 1727203836.27565: worker is 1 (out of 1 available) 13271 1727203836.27980: exiting _queue_task() for managed-node1/assert 13271 1727203836.27989: done queuing things up, now waiting for results queue to drain 13271 1727203836.27991: waiting for pending results... 13271 1727203836.28396: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0' 13271 1727203836.28401: in run() - task 028d2410-947f-2a40-12ba-000000000261 13271 1727203836.28405: variable 'ansible_search_path' from source: unknown 13271 1727203836.28408: variable 'ansible_search_path' from source: unknown 13271 1727203836.28634: calling self._execute() 13271 1727203836.28638: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203836.28641: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203836.28645: variable 'omit' from source: magic vars 13271 1727203836.28992: variable 'ansible_distribution_major_version' from source: facts 13271 1727203836.29001: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203836.29009: variable 'omit' from source: magic vars 13271 1727203836.29125: variable 'omit' from source: magic vars 13271 1727203836.29148: variable 'profile' from source: include params 13271 1727203836.29152: variable 'item' from source: include params 13271 1727203836.29214: variable 'item' from source: include params 13271 1727203836.29234: variable 'omit' from source: magic vars 13271 1727203836.29279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203836.29318: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203836.29337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203836.29353: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203836.29377: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203836.29780: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203836.29784: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203836.29786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203836.29789: Set connection var ansible_connection to ssh 13271 1727203836.29791: Set connection var ansible_shell_type to sh 13271 1727203836.29794: Set connection var ansible_timeout to 10 13271 1727203836.29797: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203836.29800: Set connection var ansible_pipelining to False 13271 1727203836.29803: Set connection var ansible_shell_executable to /bin/sh 13271 1727203836.29805: variable 'ansible_shell_executable' from source: unknown 13271 1727203836.29808: variable 'ansible_connection' from source: unknown 13271 1727203836.29811: variable 'ansible_module_compression' from source: unknown 13271 1727203836.29813: variable 'ansible_shell_type' from source: unknown 13271 1727203836.29816: variable 'ansible_shell_executable' from source: unknown 13271 1727203836.29819: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203836.29821: variable 'ansible_pipelining' from source: unknown 13271 1727203836.29823: variable 'ansible_timeout' from source: unknown 13271 1727203836.29828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 
1727203836.29849: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203836.29852: variable 'omit' from source: magic vars 13271 1727203836.29855: starting attempt loop 13271 1727203836.29858: running the handler 13271 1727203836.29905: variable 'lsr_net_profile_ansible_managed' from source: set_fact 13271 1727203836.29909: Evaluated conditional (lsr_net_profile_ansible_managed): True 13271 1727203836.29916: handler run complete 13271 1727203836.29935: attempt loop complete, returning result 13271 1727203836.29938: _execute() done 13271 1727203836.29946: dumping result to json 13271 1727203836.29949: done dumping result, returning 13271 1727203836.29958: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0' [028d2410-947f-2a40-12ba-000000000261] 13271 1727203836.29963: sending task result for task 028d2410-947f-2a40-12ba-000000000261 13271 1727203836.30044: done sending task result for task 028d2410-947f-2a40-12ba-000000000261 13271 1727203836.30047: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 13271 1727203836.30101: no more pending results, returning what we have 13271 1727203836.30104: results queue empty 13271 1727203836.30106: checking for any_errors_fatal 13271 1727203836.30112: done checking for any_errors_fatal 13271 1727203836.30113: checking for max_fail_percentage 13271 1727203836.30114: done checking for max_fail_percentage 13271 1727203836.30115: checking to see if all hosts have failed and the running result is not ok 13271 1727203836.30116: done checking to see if all hosts have failed 13271 1727203836.30117: getting the remaining hosts for this loop 13271 1727203836.30118: done 
getting the remaining hosts for this loop 13271 1727203836.30121: getting the next task for host managed-node1 13271 1727203836.30128: done getting next task for host managed-node1 13271 1727203836.30131: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 13271 1727203836.30134: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203836.30139: getting variables 13271 1727203836.30140: in VariableManager get_vars() 13271 1727203836.30413: Calling all_inventory to load vars for managed-node1 13271 1727203836.30415: Calling groups_inventory to load vars for managed-node1 13271 1727203836.30418: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203836.30428: Calling all_plugins_play to load vars for managed-node1 13271 1727203836.30430: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203836.30433: Calling groups_plugins_play to load vars for managed-node1 13271 1727203836.31972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203836.33607: done with get_vars() 13271 1727203836.33633: done getting variables 13271 1727203836.33698: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) 13271 1727203836.33815: variable 'profile' from source: include params 13271 1727203836.33819: variable 'item' from source: include params 13271 1727203836.33885: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:50:36 -0400 (0:00:00.067) 0:00:19.982 ***** 13271 1727203836.33921: entering _queue_task() for managed-node1/assert 13271 1727203836.34649: worker is 1 (out of 1 available) 13271 1727203836.34666: exiting _queue_task() for managed-node1/assert 13271 1727203836.34884: done queuing things up, now waiting for results queue to drain 13271 1727203836.34887: waiting for pending results... 13271 1727203836.35322: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0 13271 1727203836.35479: in run() - task 028d2410-947f-2a40-12ba-000000000262 13271 1727203836.35491: variable 'ansible_search_path' from source: unknown 13271 1727203836.35495: variable 'ansible_search_path' from source: unknown 13271 1727203836.35569: calling self._execute() 13271 1727203836.35699: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203836.35705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203836.35716: variable 'omit' from source: magic vars 13271 1727203836.36263: variable 'ansible_distribution_major_version' from source: facts 13271 1727203836.36271: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203836.36278: variable 'omit' from source: magic vars 13271 1727203836.36320: variable 'omit' from source: magic vars 13271 1727203836.36424: variable 'profile' from source: include params 13271 1727203836.36427: variable 'item' from source: include params 13271 
1727203836.36484: variable 'item' from source: include params 13271 1727203836.36781: variable 'omit' from source: magic vars 13271 1727203836.36784: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203836.36788: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203836.36790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203836.36793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203836.36795: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203836.36797: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203836.36799: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203836.36801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203836.36803: Set connection var ansible_connection to ssh 13271 1727203836.36805: Set connection var ansible_shell_type to sh 13271 1727203836.36806: Set connection var ansible_timeout to 10 13271 1727203836.36808: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203836.36810: Set connection var ansible_pipelining to False 13271 1727203836.36813: Set connection var ansible_shell_executable to /bin/sh 13271 1727203836.36815: variable 'ansible_shell_executable' from source: unknown 13271 1727203836.36817: variable 'ansible_connection' from source: unknown 13271 1727203836.36819: variable 'ansible_module_compression' from source: unknown 13271 1727203836.36821: variable 'ansible_shell_type' from source: unknown 13271 1727203836.36823: variable 'ansible_shell_executable' from source: unknown 13271 1727203836.36825: variable 'ansible_host' from source: host 
vars for 'managed-node1' 13271 1727203836.36830: variable 'ansible_pipelining' from source: unknown 13271 1727203836.36832: variable 'ansible_timeout' from source: unknown 13271 1727203836.36842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203836.36979: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203836.36990: variable 'omit' from source: magic vars 13271 1727203836.36996: starting attempt loop 13271 1727203836.37000: running the handler 13271 1727203836.37116: variable 'lsr_net_profile_fingerprint' from source: set_fact 13271 1727203836.37120: Evaluated conditional (lsr_net_profile_fingerprint): True 13271 1727203836.37128: handler run complete 13271 1727203836.37143: attempt loop complete, returning result 13271 1727203836.37146: _execute() done 13271 1727203836.37149: dumping result to json 13271 1727203836.37152: done dumping result, returning 13271 1727203836.37160: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0 [028d2410-947f-2a40-12ba-000000000262] 13271 1727203836.37173: sending task result for task 028d2410-947f-2a40-12ba-000000000262 13271 1727203836.37253: done sending task result for task 028d2410-947f-2a40-12ba-000000000262 13271 1727203836.37256: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 13271 1727203836.37328: no more pending results, returning what we have 13271 1727203836.37339: results queue empty 13271 1727203836.37340: checking for any_errors_fatal 13271 1727203836.37352: done checking for any_errors_fatal 13271 1727203836.37353: checking for max_fail_percentage 13271 1727203836.37355: done checking for max_fail_percentage 
13271 1727203836.37356: checking to see if all hosts have failed and the running result is not ok 13271 1727203836.37357: done checking to see if all hosts have failed 13271 1727203836.37358: getting the remaining hosts for this loop 13271 1727203836.37359: done getting the remaining hosts for this loop 13271 1727203836.37365: getting the next task for host managed-node1 13271 1727203836.37374: done getting next task for host managed-node1 13271 1727203836.37379: ^ task is: TASK: Include the task 'get_profile_stat.yml' 13271 1727203836.37382: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203836.37386: getting variables 13271 1727203836.37388: in VariableManager get_vars() 13271 1727203836.37428: Calling all_inventory to load vars for managed-node1 13271 1727203836.37431: Calling groups_inventory to load vars for managed-node1 13271 1727203836.37433: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203836.37445: Calling all_plugins_play to load vars for managed-node1 13271 1727203836.37448: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203836.37450: Calling groups_plugins_play to load vars for managed-node1 13271 1727203836.39792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203836.41394: done with get_vars() 13271 1727203836.41435: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:50:36 -0400 (0:00:00.077) 0:00:20.060 ***** 13271 1727203836.41724: entering _queue_task() for managed-node1/include_tasks 13271 1727203836.42316: worker is 1 (out of 1 available) 13271 1727203836.42330: exiting _queue_task() for managed-node1/include_tasks 13271 1727203836.42342: done queuing things up, now waiting for results queue to drain 13271 1727203836.42344: waiting for pending results... 
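The assert traced above (conditional `lsr_net_profile_fingerprint` evaluated to True, result "All assertions passed") suggests the task at assert_profile_present.yml:15 is shaped roughly like the sketch below. Only the task title and the fact name are confirmed by the log; the exact `that` list and any `msg` are a hypothetical reconstruction.

```yaml
# Hypothetical reconstruction of the task at
# tests/network/playbooks/tasks/assert_profile_present.yml:15 --
# only the task title and the lsr_net_profile_fingerprint fact name
# are confirmed by the trace above.
- name: "Assert that the fingerprint comment is present in {{ profile }}"
  assert:
    that:
      - lsr_net_profile_fingerprint
```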
13271 1727203836.42904: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 13271 1727203836.43070: in run() - task 028d2410-947f-2a40-12ba-000000000266 13271 1727203836.43117: variable 'ansible_search_path' from source: unknown 13271 1727203836.43132: variable 'ansible_search_path' from source: unknown 13271 1727203836.43319: calling self._execute() 13271 1727203836.43409: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203836.43462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203836.43479: variable 'omit' from source: magic vars 13271 1727203836.44256: variable 'ansible_distribution_major_version' from source: facts 13271 1727203836.44316: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203836.44332: _execute() done 13271 1727203836.44408: dumping result to json 13271 1727203836.44411: done dumping result, returning 13271 1727203836.44413: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [028d2410-947f-2a40-12ba-000000000266] 13271 1727203836.44415: sending task result for task 028d2410-947f-2a40-12ba-000000000266 13271 1727203836.44599: done sending task result for task 028d2410-947f-2a40-12ba-000000000266 13271 1727203836.44603: WORKER PROCESS EXITING 13271 1727203836.44651: no more pending results, returning what we have 13271 1727203836.44656: in VariableManager get_vars() 13271 1727203836.44707: Calling all_inventory to load vars for managed-node1 13271 1727203836.44709: Calling groups_inventory to load vars for managed-node1 13271 1727203836.44711: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203836.44725: Calling all_plugins_play to load vars for managed-node1 13271 1727203836.44728: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203836.44730: Calling groups_plugins_play to load vars for managed-node1 13271 
1727203836.48106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203836.51118: done with get_vars() 13271 1727203836.51146: variable 'ansible_search_path' from source: unknown 13271 1727203836.51148: variable 'ansible_search_path' from source: unknown 13271 1727203836.51192: we have included files to process 13271 1727203836.51193: generating all_blocks data 13271 1727203836.51195: done generating all_blocks data 13271 1727203836.51200: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13271 1727203836.51202: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13271 1727203836.51204: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13271 1727203836.53137: done processing included file 13271 1727203836.53139: iterating over new_blocks loaded from include file 13271 1727203836.53141: in VariableManager get_vars() 13271 1727203836.53164: done with get_vars() 13271 1727203836.53166: filtering new block on tags 13271 1727203836.53192: done filtering new block on tags 13271 1727203836.53196: in VariableManager get_vars() 13271 1727203836.53214: done with get_vars() 13271 1727203836.53216: filtering new block on tags 13271 1727203836.53237: done filtering new block on tags 13271 1727203836.53239: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 13271 1727203836.53244: extending task lists for all hosts with included blocks 13271 1727203836.53626: done extending task lists 13271 1727203836.53628: done processing included files 13271 1727203836.53629: results queue empty 13271 
1727203836.53629: checking for any_errors_fatal 13271 1727203836.53633: done checking for any_errors_fatal 13271 1727203836.53633: checking for max_fail_percentage 13271 1727203836.53878: done checking for max_fail_percentage 13271 1727203836.53880: checking to see if all hosts have failed and the running result is not ok 13271 1727203836.53881: done checking to see if all hosts have failed 13271 1727203836.53882: getting the remaining hosts for this loop 13271 1727203836.53883: done getting the remaining hosts for this loop 13271 1727203836.53886: getting the next task for host managed-node1 13271 1727203836.53890: done getting next task for host managed-node1 13271 1727203836.53893: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 13271 1727203836.53895: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203836.53898: getting variables 13271 1727203836.53899: in VariableManager get_vars() 13271 1727203836.53913: Calling all_inventory to load vars for managed-node1 13271 1727203836.53915: Calling groups_inventory to load vars for managed-node1 13271 1727203836.53917: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203836.53923: Calling all_plugins_play to load vars for managed-node1 13271 1727203836.53925: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203836.53928: Calling groups_plugins_play to load vars for managed-node1 13271 1727203836.55822: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203836.57767: done with get_vars() 13271 1727203836.57791: done getting variables 13271 1727203836.57841: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:50:36 -0400 (0:00:00.161) 0:00:20.221 ***** 13271 1727203836.57875: entering _queue_task() for managed-node1/set_fact 13271 1727203836.58248: worker is 1 (out of 1 available) 13271 1727203836.58261: exiting _queue_task() for managed-node1/set_fact 13271 1727203836.58273: done queuing things up, now waiting for results queue to drain 13271 1727203836.58408: waiting for pending results... 
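The include flow logged above (load get_profile_stat.yml, generate all_blocks data, filter new blocks on tags, extend task lists for all hosts) is what a plain `include_tasks` entry produces. Based on the task name and file path printed in the trace, it presumably looks like:

```yaml
# Hypothetical sketch -- confirmed only by the task name and the
# included file path printed in the trace above.
- name: "Include the task 'get_profile_stat.yml'"
  include_tasks: get_profile_stat.yml
```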
13271 1727203836.58644: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 13271 1727203836.58739: in run() - task 028d2410-947f-2a40-12ba-0000000003f8 13271 1727203836.58743: variable 'ansible_search_path' from source: unknown 13271 1727203836.58746: variable 'ansible_search_path' from source: unknown 13271 1727203836.58778: calling self._execute() 13271 1727203836.58952: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203836.58958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203836.58961: variable 'omit' from source: magic vars 13271 1727203836.59336: variable 'ansible_distribution_major_version' from source: facts 13271 1727203836.59350: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203836.59361: variable 'omit' from source: magic vars 13271 1727203836.59415: variable 'omit' from source: magic vars 13271 1727203836.59455: variable 'omit' from source: magic vars 13271 1727203836.59509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203836.59609: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203836.59612: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203836.59615: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203836.59617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203836.59649: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203836.59658: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203836.59666: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 13271 1727203836.59770: Set connection var ansible_connection to ssh 13271 1727203836.59785: Set connection var ansible_shell_type to sh 13271 1727203836.59796: Set connection var ansible_timeout to 10 13271 1727203836.59804: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203836.59813: Set connection var ansible_pipelining to False 13271 1727203836.59841: Set connection var ansible_shell_executable to /bin/sh 13271 1727203836.59864: variable 'ansible_shell_executable' from source: unknown 13271 1727203836.59879: variable 'ansible_connection' from source: unknown 13271 1727203836.59882: variable 'ansible_module_compression' from source: unknown 13271 1727203836.59885: variable 'ansible_shell_type' from source: unknown 13271 1727203836.59936: variable 'ansible_shell_executable' from source: unknown 13271 1727203836.59939: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203836.59941: variable 'ansible_pipelining' from source: unknown 13271 1727203836.59944: variable 'ansible_timeout' from source: unknown 13271 1727203836.59946: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203836.60069: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203836.60154: variable 'omit' from source: magic vars 13271 1727203836.60157: starting attempt loop 13271 1727203836.60159: running the handler 13271 1727203836.60161: handler run complete 13271 1727203836.60165: attempt loop complete, returning result 13271 1727203836.60168: _execute() done 13271 1727203836.60170: dumping result to json 13271 1727203836.60171: done dumping result, returning 13271 1727203836.60173: done running TaskExecutor() for 
managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [028d2410-947f-2a40-12ba-0000000003f8] 13271 1727203836.60177: sending task result for task 028d2410-947f-2a40-12ba-0000000003f8 ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 13271 1727203836.60340: no more pending results, returning what we have 13271 1727203836.60344: results queue empty 13271 1727203836.60346: checking for any_errors_fatal 13271 1727203836.60347: done checking for any_errors_fatal 13271 1727203836.60348: checking for max_fail_percentage 13271 1727203836.60349: done checking for max_fail_percentage 13271 1727203836.60350: checking to see if all hosts have failed and the running result is not ok 13271 1727203836.60351: done checking to see if all hosts have failed 13271 1727203836.60352: getting the remaining hosts for this loop 13271 1727203836.60353: done getting the remaining hosts for this loop 13271 1727203836.60357: getting the next task for host managed-node1 13271 1727203836.60481: done getting next task for host managed-node1 13271 1727203836.60485: ^ task is: TASK: Stat profile file 13271 1727203836.60490: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203836.60495: getting variables 13271 1727203836.60497: in VariableManager get_vars() 13271 1727203836.60538: Calling all_inventory to load vars for managed-node1 13271 1727203836.60541: Calling groups_inventory to load vars for managed-node1 13271 1727203836.60544: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203836.60556: Calling all_plugins_play to load vars for managed-node1 13271 1727203836.60559: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203836.60562: Calling groups_plugins_play to load vars for managed-node1 13271 1727203836.61097: done sending task result for task 028d2410-947f-2a40-12ba-0000000003f8 13271 1727203836.61101: WORKER PROCESS EXITING 13271 1727203836.62291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203836.64444: done with get_vars() 13271 1727203836.64589: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:50:36 -0400 (0:00:00.069) 0:00:20.291 ***** 13271 1727203836.64787: entering _queue_task() for managed-node1/stat 13271 1727203836.65470: worker is 1 (out of 1 available) 13271 1727203836.65483: exiting _queue_task() for managed-node1/stat 13271 1727203836.65495: done queuing things up, now waiting for results queue to drain 13271 1727203836.65497: waiting for pending results... 
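The result payload above shows exactly which facts the "Initialize NM profile exist and ansible_managed comment flag" task sets, so the `set_fact` at get_profile_stat.yml:3 can be reconstructed with reasonable confidence as a sketch:

```yaml
# Reconstructed from the "ansible_facts" in the task result above:
# the three flags are initialized to false before the profile file is
# actually inspected. Key ordering is an assumption.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```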
13271 1727203836.65945: running TaskExecutor() for managed-node1/TASK: Stat profile file 13271 1727203836.66601: in run() - task 028d2410-947f-2a40-12ba-0000000003f9 13271 1727203836.66615: variable 'ansible_search_path' from source: unknown 13271 1727203836.66618: variable 'ansible_search_path' from source: unknown 13271 1727203836.66663: calling self._execute() 13271 1727203836.66753: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203836.66757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203836.66770: variable 'omit' from source: magic vars 13271 1727203836.67550: variable 'ansible_distribution_major_version' from source: facts 13271 1727203836.67565: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203836.67590: variable 'omit' from source: magic vars 13271 1727203836.67934: variable 'omit' from source: magic vars 13271 1727203836.68189: variable 'profile' from source: include params 13271 1727203836.68192: variable 'item' from source: include params 13271 1727203836.68263: variable 'item' from source: include params 13271 1727203836.68281: variable 'omit' from source: magic vars 13271 1727203836.68680: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203836.68686: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203836.68689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203836.68693: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203836.68814: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203836.68817: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 
1727203836.68819: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203836.68822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203836.68945: Set connection var ansible_connection to ssh 13271 1727203836.68954: Set connection var ansible_shell_type to sh 13271 1727203836.68966: Set connection var ansible_timeout to 10 13271 1727203836.68970: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203836.68983: Set connection var ansible_pipelining to False 13271 1727203836.68986: Set connection var ansible_shell_executable to /bin/sh 13271 1727203836.69141: variable 'ansible_shell_executable' from source: unknown 13271 1727203836.69144: variable 'ansible_connection' from source: unknown 13271 1727203836.69147: variable 'ansible_module_compression' from source: unknown 13271 1727203836.69149: variable 'ansible_shell_type' from source: unknown 13271 1727203836.69151: variable 'ansible_shell_executable' from source: unknown 13271 1727203836.69153: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203836.69155: variable 'ansible_pipelining' from source: unknown 13271 1727203836.69157: variable 'ansible_timeout' from source: unknown 13271 1727203836.69159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203836.69543: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13271 1727203836.69550: variable 'omit' from source: magic vars 13271 1727203836.69557: starting attempt loop 13271 1727203836.69560: running the handler 13271 1727203836.69574: _low_level_execute_command(): starting 13271 1727203836.69695: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203836.71297: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203836.71431: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203836.71451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203836.71501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203836.71612: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203836.73393: stdout chunk (state=3): >>>/root <<< 13271 1727203836.73550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203836.73554: stdout chunk (state=3): >>><<< 13271 1727203836.73556: stderr chunk (state=3): >>><<< 13271 1727203836.73581: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203836.73607: _low_level_execute_command(): starting 13271 1727203836.73703: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203836.7358813-14650-107318852142119 `" && echo ansible-tmp-1727203836.7358813-14650-107318852142119="` echo /root/.ansible/tmp/ansible-tmp-1727203836.7358813-14650-107318852142119 `" ) && sleep 0' 13271 1727203836.74264: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203836.74283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203836.74299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203836.74323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203836.74429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203836.74461: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203836.74575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203836.76680: stdout chunk (state=3): >>>ansible-tmp-1727203836.7358813-14650-107318852142119=/root/.ansible/tmp/ansible-tmp-1727203836.7358813-14650-107318852142119 <<< 13271 1727203836.76835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203836.76855: stderr chunk (state=3): >>><<< 13271 1727203836.76865: stdout chunk (state=3): >>><<< 13271 1727203836.77082: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203836.7358813-14650-107318852142119=/root/.ansible/tmp/ansible-tmp-1727203836.7358813-14650-107318852142119 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203836.77086: variable 'ansible_module_compression' from source: unknown 13271 1727203836.77089: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13271 1727203836.77091: variable 'ansible_facts' from source: unknown 13271 1727203836.77149: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203836.7358813-14650-107318852142119/AnsiballZ_stat.py 13271 1727203836.77335: Sending initial data 13271 1727203836.77344: Sent initial data (153 bytes) 13271 1727203836.77974: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203836.77993: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203836.78007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203836.78098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203836.78131: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203836.78154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203836.78262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203836.80031: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203836.80123: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203836.80221: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmp36z_vbvq /root/.ansible/tmp/ansible-tmp-1727203836.7358813-14650-107318852142119/AnsiballZ_stat.py <<< 13271 1727203836.80241: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203836.7358813-14650-107318852142119/AnsiballZ_stat.py" <<< 13271 1727203836.80292: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 13271 1727203836.80324: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmp36z_vbvq" to remote "/root/.ansible/tmp/ansible-tmp-1727203836.7358813-14650-107318852142119/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203836.7358813-14650-107318852142119/AnsiballZ_stat.py" <<< 13271 1727203836.81259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203836.81287: stderr chunk (state=3): >>><<< 13271 1727203836.81333: stdout chunk (state=3): >>><<< 13271 1727203836.81370: done transferring module to remote 13271 1727203836.81373: _low_level_execute_command(): starting 13271 1727203836.81379: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203836.7358813-14650-107318852142119/ /root/.ansible/tmp/ansible-tmp-1727203836.7358813-14650-107318852142119/AnsiballZ_stat.py && sleep 0' 13271 1727203836.81803: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203836.81807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203836.81837: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203836.81840: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203836.81843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203836.81845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203836.81897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203836.81901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203836.81993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203836.84014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203836.84042: stderr chunk (state=3): >>><<< 13271 1727203836.84050: stdout chunk (state=3): >>><<< 13271 1727203836.84082: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203836.84088: _low_level_execute_command(): starting 13271 1727203836.84091: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203836.7358813-14650-107318852142119/AnsiballZ_stat.py && sleep 0' 13271 1727203836.84781: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203836.84785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203836.84801: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203836.84856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' <<< 13271 1727203836.84877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203836.84960: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203836.85029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203837.01547: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13271 1727203837.03093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 13271 1727203837.03124: stderr chunk (state=3): >>><<< 13271 1727203837.03127: stdout chunk (state=3): >>><<< 13271 1727203837.03144: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 13271 1727203837.03170: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203836.7358813-14650-107318852142119/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203837.03180: _low_level_execute_command(): starting 13271 1727203837.03185: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203836.7358813-14650-107318852142119/ > /dev/null 2>&1 && sleep 0' 13271 1727203837.03636: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203837.03639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203837.03642: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203837.03644: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203837.03646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203837.03690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203837.03701: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203837.03795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203837.05762: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203837.05790: stderr chunk (state=3): >>><<< 13271 1727203837.05793: stdout chunk (state=3): >>><<< 13271 1727203837.05810: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203837.05813: handler run complete 13271 1727203837.05830: attempt loop complete, returning result 13271 1727203837.05833: _execute() done 13271 1727203837.05836: dumping result to json 13271 1727203837.05839: done dumping result, returning 13271 1727203837.05847: done running TaskExecutor() for managed-node1/TASK: Stat profile file [028d2410-947f-2a40-12ba-0000000003f9] 13271 1727203837.05851: sending task result for task 028d2410-947f-2a40-12ba-0000000003f9 13271 1727203837.05943: done sending task result for task 028d2410-947f-2a40-12ba-0000000003f9 13271 1727203837.05946: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 13271 1727203837.06005: no more pending results, returning what we have 13271 1727203837.06008: results queue empty 13271 1727203837.06009: checking for any_errors_fatal 13271 1727203837.06015: done checking for any_errors_fatal 13271 1727203837.06016: checking for max_fail_percentage 13271 1727203837.06017: done checking for max_fail_percentage 13271 1727203837.06018: checking to see if all hosts have failed and the running result is not ok 13271 1727203837.06019: done checking to see if all hosts have failed 13271 1727203837.06019: getting the remaining hosts for this loop 13271 1727203837.06025: done getting the remaining hosts for this loop 13271 1727203837.06028: getting the next task for host managed-node1 13271 
1727203837.06034: done getting next task for host managed-node1 13271 1727203837.06037: ^ task is: TASK: Set NM profile exist flag based on the profile files 13271 1727203837.06041: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203837.06044: getting variables 13271 1727203837.06046: in VariableManager get_vars() 13271 1727203837.06094: Calling all_inventory to load vars for managed-node1 13271 1727203837.06096: Calling groups_inventory to load vars for managed-node1 13271 1727203837.06098: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203837.06109: Calling all_plugins_play to load vars for managed-node1 13271 1727203837.06112: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203837.06115: Calling groups_plugins_play to load vars for managed-node1 13271 1727203837.07018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203837.10976: done with get_vars() 13271 1727203837.10995: done getting variables 13271 1727203837.11033: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:50:37 -0400 (0:00:00.462) 0:00:20.753 ***** 13271 1727203837.11052: entering _queue_task() for managed-node1/set_fact 13271 1727203837.11314: worker is 1 (out of 1 available) 13271 1727203837.11328: exiting _queue_task() for managed-node1/set_fact 13271 1727203837.11339: done queuing things up, now waiting for results queue to drain 13271 1727203837.11341: waiting for pending results... 
13271 1727203837.11512: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 13271 1727203837.11598: in run() - task 028d2410-947f-2a40-12ba-0000000003fa 13271 1727203837.11610: variable 'ansible_search_path' from source: unknown 13271 1727203837.11613: variable 'ansible_search_path' from source: unknown 13271 1727203837.11641: calling self._execute() 13271 1727203837.11711: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203837.11715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203837.11723: variable 'omit' from source: magic vars 13271 1727203837.12002: variable 'ansible_distribution_major_version' from source: facts 13271 1727203837.12013: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203837.12093: variable 'profile_stat' from source: set_fact 13271 1727203837.12104: Evaluated conditional (profile_stat.stat.exists): False 13271 1727203837.12107: when evaluation is False, skipping this task 13271 1727203837.12110: _execute() done 13271 1727203837.12115: dumping result to json 13271 1727203837.12118: done dumping result, returning 13271 1727203837.12120: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [028d2410-947f-2a40-12ba-0000000003fa] 13271 1727203837.12129: sending task result for task 028d2410-947f-2a40-12ba-0000000003fa 13271 1727203837.12211: done sending task result for task 028d2410-947f-2a40-12ba-0000000003fa 13271 1727203837.12214: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13271 1727203837.12280: no more pending results, returning what we have 13271 1727203837.12284: results queue empty 13271 1727203837.12285: checking for any_errors_fatal 13271 1727203837.12293: done checking for any_errors_fatal 13271 1727203837.12294: 
checking for max_fail_percentage 13271 1727203837.12295: done checking for max_fail_percentage 13271 1727203837.12296: checking to see if all hosts have failed and the running result is not ok 13271 1727203837.12297: done checking to see if all hosts have failed 13271 1727203837.12298: getting the remaining hosts for this loop 13271 1727203837.12299: done getting the remaining hosts for this loop 13271 1727203837.12302: getting the next task for host managed-node1 13271 1727203837.12308: done getting next task for host managed-node1 13271 1727203837.12310: ^ task is: TASK: Get NM profile info 13271 1727203837.12313: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203837.12317: getting variables 13271 1727203837.12318: in VariableManager get_vars() 13271 1727203837.12353: Calling all_inventory to load vars for managed-node1 13271 1727203837.12356: Calling groups_inventory to load vars for managed-node1 13271 1727203837.12357: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203837.12369: Calling all_plugins_play to load vars for managed-node1 13271 1727203837.12372: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203837.12374: Calling groups_plugins_play to load vars for managed-node1 13271 1727203837.13113: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203837.13980: done with get_vars() 13271 1727203837.13994: done getting variables 13271 1727203837.14034: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:50:37 -0400 (0:00:00.030) 0:00:20.783 ***** 13271 1727203837.14055: entering _queue_task() for managed-node1/shell 13271 1727203837.14277: worker is 1 (out of 1 available) 13271 1727203837.14290: exiting _queue_task() for managed-node1/shell 13271 1727203837.14300: done queuing things up, now waiting for results queue to drain 13271 1727203837.14302: waiting for pending results... 
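
The skipped `set_fact` above, together with the stat result earlier in the log, illustrates how Ansible consumes a module's JSON reply: the remote `AnsiballZ_*.py` script prints a single JSON document on stdout, the controller parses it out of the `_low_level_execute_command()` output, registers it, and then evaluates `when:` conditionals against the registered result. A minimal sketch of that parsing step follows; the JSON payload is copied verbatim from this log, while the helper function is hypothetical and not part of Ansible's API.

```python
import json

# JSON reply printed by AnsiballZ_stat.py on the remote host,
# copied verbatim from the stdout chunk earlier in this log.
raw_reply = (
    '{"changed": false, "stat": {"exists": false}, '
    '"invocation": {"module_args": {"get_attributes": false, '
    '"get_checksum": false, "get_mime": false, '
    '"path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", '
    '"follow": false, "checksum_algorithm": "sha1"}}}'
)

def should_run_set_fact(reply_json: str) -> bool:
    """Hypothetical helper: evaluate the 'profile_stat.stat.exists'
    conditional the same way the log shows it being evaluated."""
    result = json.loads(reply_json)
    return bool(result["stat"]["exists"])

# The log records: Evaluated conditional (profile_stat.stat.exists): False,
# so the set_fact task is skipped.
print(should_run_set_fact(raw_reply))
```

This mirrors the sequence recorded in the log: `ok: {"changed": false, "stat": {"exists": false}}` is registered as `profile_stat`, then the conditional evaluates to `False` and the dependent task reports `skip_reason: "Conditional result was False"`.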
13271 1727203837.14458: running TaskExecutor() for managed-node1/TASK: Get NM profile info 13271 1727203837.14541: in run() - task 028d2410-947f-2a40-12ba-0000000003fb 13271 1727203837.14552: variable 'ansible_search_path' from source: unknown 13271 1727203837.14556: variable 'ansible_search_path' from source: unknown 13271 1727203837.14585: calling self._execute() 13271 1727203837.14653: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203837.14658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203837.14668: variable 'omit' from source: magic vars 13271 1727203837.14924: variable 'ansible_distribution_major_version' from source: facts 13271 1727203837.14934: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203837.14939: variable 'omit' from source: magic vars 13271 1727203837.14972: variable 'omit' from source: magic vars 13271 1727203837.15039: variable 'profile' from source: include params 13271 1727203837.15043: variable 'item' from source: include params 13271 1727203837.15093: variable 'item' from source: include params 13271 1727203837.15107: variable 'omit' from source: magic vars 13271 1727203837.15138: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203837.15167: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203837.15188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203837.15200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203837.15210: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203837.15234: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 
1727203837.15237: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203837.15239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203837.15307: Set connection var ansible_connection to ssh 13271 1727203837.15314: Set connection var ansible_shell_type to sh 13271 1727203837.15321: Set connection var ansible_timeout to 10 13271 1727203837.15326: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203837.15331: Set connection var ansible_pipelining to False 13271 1727203837.15336: Set connection var ansible_shell_executable to /bin/sh 13271 1727203837.15353: variable 'ansible_shell_executable' from source: unknown 13271 1727203837.15356: variable 'ansible_connection' from source: unknown 13271 1727203837.15359: variable 'ansible_module_compression' from source: unknown 13271 1727203837.15364: variable 'ansible_shell_type' from source: unknown 13271 1727203837.15367: variable 'ansible_shell_executable' from source: unknown 13271 1727203837.15369: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203837.15372: variable 'ansible_pipelining' from source: unknown 13271 1727203837.15376: variable 'ansible_timeout' from source: unknown 13271 1727203837.15379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203837.15474: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203837.15484: variable 'omit' from source: magic vars 13271 1727203837.15489: starting attempt loop 13271 1727203837.15491: running the handler 13271 1727203837.15499: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203837.15517: _low_level_execute_command(): starting 13271 1727203837.15523: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203837.16027: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203837.16031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203837.16034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203837.16036: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203837.16081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203837.16098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203837.16105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203837.16194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203837.18003: stdout chunk (state=3): >>>/root <<< 13271 1727203837.18098: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203837.18133: stderr chunk (state=3): >>><<< 13271 1727203837.18137: stdout chunk (state=3): >>><<< 13271 1727203837.18168: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203837.18248: _low_level_execute_command(): starting 13271 1727203837.18252: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203837.1816654-14670-65596849449321 `" && echo ansible-tmp-1727203837.1816654-14670-65596849449321="` echo /root/.ansible/tmp/ansible-tmp-1727203837.1816654-14670-65596849449321 `" ) && sleep 0' 13271 1727203837.18825: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203837.18828: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203837.18831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203837.18833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203837.18844: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203837.18851: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203837.18935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203837.18939: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203837.18965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203837.18989: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203837.19097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203837.21222: stdout chunk (state=3): >>>ansible-tmp-1727203837.1816654-14670-65596849449321=/root/.ansible/tmp/ansible-tmp-1727203837.1816654-14670-65596849449321 <<< 13271 1727203837.21792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203837.21796: stdout chunk (state=3): >>><<< 13271 1727203837.21799: stderr chunk (state=3): >>><<< 13271 
1727203837.21801: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203837.1816654-14670-65596849449321=/root/.ansible/tmp/ansible-tmp-1727203837.1816654-14670-65596849449321 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203837.21803: variable 'ansible_module_compression' from source: unknown 13271 1727203837.21816: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13271 1727203837.21857: variable 'ansible_facts' from source: unknown 13271 1727203837.21952: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203837.1816654-14670-65596849449321/AnsiballZ_command.py 13271 1727203837.22137: Sending initial data 13271 1727203837.22140: Sent initial data (155 bytes) 13271 1727203837.22719: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 13271 1727203837.22732: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203837.22784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 13271 1727203837.22803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203837.22883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203837.22905: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203837.23016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203837.24779: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server 
supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203837.24884: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13271 1727203837.24974: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpuv2a31as /root/.ansible/tmp/ansible-tmp-1727203837.1816654-14670-65596849449321/AnsiballZ_command.py <<< 13271 1727203837.24980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203837.1816654-14670-65596849449321/AnsiballZ_command.py" <<< 13271 1727203837.25060: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpuv2a31as" to remote "/root/.ansible/tmp/ansible-tmp-1727203837.1816654-14670-65596849449321/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203837.1816654-14670-65596849449321/AnsiballZ_command.py" <<< 13271 1727203837.26188: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203837.26192: stdout chunk (state=3): >>><<< 13271 1727203837.26194: stderr chunk (state=3): >>><<< 13271 1727203837.26196: done transferring module to remote 13271 1727203837.26198: _low_level_execute_command(): starting 13271 1727203837.26200: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203837.1816654-14670-65596849449321/ /root/.ansible/tmp/ansible-tmp-1727203837.1816654-14670-65596849449321/AnsiballZ_command.py && sleep 0' 13271 1727203837.26822: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203837.26843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203837.26887: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 13271 1727203837.26899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203837.26910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203837.26922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203837.26956: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203837.27015: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203837.27034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203837.27066: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203837.27197: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203837.29239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203837.29243: stdout chunk (state=3): >>><<< 13271 1727203837.29254: stderr chunk (state=3): >>><<< 13271 1727203837.29274: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203837.29280: _low_level_execute_command(): starting 13271 1727203837.29285: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203837.1816654-14670-65596849449321/AnsiballZ_command.py && sleep 0' 13271 1727203837.29900: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203837.30059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203837.30064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203837.30066: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203837.30069: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203837.30071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203837.30367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203837.49809: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:50:37.473652", "end": "2024-09-24 14:50:37.495654", "delta": "0:00:00.022002", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13271 1727203837.51556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203837.52285: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203837.52288: stdout chunk (state=3): >>><<< 13271 1727203837.52291: stderr chunk (state=3): >>><<< 13271 1727203837.52293: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:50:37.473652", "end": "2024-09-24 14:50:37.495654", "delta": "0:00:00.022002", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.14.47 closed. 13271 1727203837.52297: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203837.1816654-14670-65596849449321/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203837.52303: _low_level_execute_command(): starting 13271 1727203837.52306: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203837.1816654-14670-65596849449321/ > /dev/null 2>&1 && sleep 0' 13271 1727203837.53129: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203837.53142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203837.53154: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 13271 1727203837.53165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203837.53268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203837.53390: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203837.53433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203837.55514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203837.55593: stderr chunk (state=3): >>><<< 13271 1727203837.55601: stdout chunk (state=3): >>><<< 13271 1727203837.55623: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13271 1727203837.55636: handler run complete
13271 1727203837.55661: Evaluated conditional (False): False
13271 1727203837.55980: attempt loop complete, returning result
13271 1727203837.55984: _execute() done
13271 1727203837.55986: dumping result to json
13271 1727203837.55988: done dumping result, returning
13271 1727203837.55990: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [028d2410-947f-2a40-12ba-0000000003fb]
13271 1727203837.55993: sending task result for task 028d2410-947f-2a40-12ba-0000000003fb
13271 1727203837.56063: done sending task result for task 028d2410-947f-2a40-12ba-0000000003fb
ok: [managed-node1] => {
    "changed": false,
    "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc",
    "delta": "0:00:00.022002",
    "end": "2024-09-24 14:50:37.495654",
    "rc": 0,
    "start": "2024-09-24 14:50:37.473652"
}

STDOUT:

bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection

13271 1727203837.56130: no more pending results, returning what we have
13271 1727203837.56133: results queue empty
13271 1727203837.56134: checking for any_errors_fatal
13271 1727203837.56140: done checking for any_errors_fatal
13271 1727203837.56140: checking for max_fail_percentage
13271 1727203837.56142: done checking for max_fail_percentage
13271 1727203837.56379: checking to see if all hosts have failed and the running result is not ok
13271 1727203837.56381: done checking to see if all hosts have failed
13271 1727203837.56381: getting the remaining hosts for this loop
13271 1727203837.56383: done getting the remaining hosts for this loop
13271 1727203837.56387: getting the next task for host managed-node1
13271 1727203837.56395: done getting next task for host managed-node1
13271 1727203837.56397: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
13271 1727203837.56401: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203837.56405: getting variables
13271 1727203837.56407: in VariableManager get_vars()
13271 1727203837.56448: Calling all_inventory to load vars for managed-node1
13271 1727203837.56450: Calling groups_inventory to load vars for managed-node1
13271 1727203837.56453: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203837.56467: Calling all_plugins_play to load vars for managed-node1
13271 1727203837.56470: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203837.56473: Calling groups_plugins_play to load vars for managed-node1
13271 1727203837.57029: WORKER PROCESS EXITING
13271 1727203837.59036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203837.60905: done with get_vars()
13271 1727203837.60931: done getting variables
13271 1727203837.60995: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35
Tuesday 24 September 2024 14:50:37 -0400 (0:00:00.469) 0:00:21.253 *****
13271 1727203837.61029: entering _queue_task() for managed-node1/set_fact
13271 1727203837.61498: worker is 1 (out of 1 available)
13271 1727203837.61510: exiting _queue_task() for managed-node1/set_fact
13271 1727203837.61521: done queuing things up, now waiting for results queue to drain
13271 1727203837.61522: waiting for pending results...
13271 1727203837.61758: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
13271 1727203837.62014: in run() - task 028d2410-947f-2a40-12ba-0000000003fc
13271 1727203837.62030: variable 'ansible_search_path' from source: unknown
13271 1727203837.62033: variable 'ansible_search_path' from source: unknown
13271 1727203837.62073: calling self._execute()
13271 1727203837.62249: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203837.62255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203837.62273: variable 'omit' from source: magic vars
13271 1727203837.62675: variable 'ansible_distribution_major_version' from source: facts
13271 1727203837.62688: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203837.62981: variable 'nm_profile_exists' from source: set_fact
13271 1727203837.62985: Evaluated conditional (nm_profile_exists.rc == 0): True
13271 1727203837.62987: variable 'omit' from source: magic vars
13271 1727203837.62990: variable 'omit' from source: magic vars
13271 1727203837.62992:
variable 'omit' from source: magic vars
13271 1727203837.62995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13271 1727203837.62997: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13271 1727203837.63015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13271 1727203837.63032: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13271 1727203837.63044: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13271 1727203837.63099: variable 'inventory_hostname' from source: host vars for 'managed-node1'
13271 1727203837.63102: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203837.63105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203837.63356: Set connection var ansible_connection to ssh
13271 1727203837.63360: Set connection var ansible_shell_type to sh
13271 1727203837.63363: Set connection var ansible_timeout to 10
13271 1727203837.63365: Set connection var ansible_module_compression to ZIP_DEFLATED
13271 1727203837.63367: Set connection var ansible_pipelining to False
13271 1727203837.63369: Set connection var ansible_shell_executable to /bin/sh
13271 1727203837.63372: variable 'ansible_shell_executable' from source: unknown
13271 1727203837.63374: variable 'ansible_connection' from source: unknown
13271 1727203837.63379: variable 'ansible_module_compression' from source: unknown
13271 1727203837.63381: variable 'ansible_shell_type' from source: unknown
13271 1727203837.63384: variable 'ansible_shell_executable' from source: unknown
13271 1727203837.63386: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203837.63389: variable 'ansible_pipelining' from source: unknown
13271 1727203837.63391: variable 'ansible_timeout' from source: unknown
13271 1727203837.63394: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203837.63440: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
13271 1727203837.63451: variable 'omit' from source: magic vars
13271 1727203837.63457: starting attempt loop
13271 1727203837.63461: running the handler
13271 1727203837.63476: handler run complete
13271 1727203837.63486: attempt loop complete, returning result
13271 1727203837.63489: _execute() done
13271 1727203837.63492: dumping result to json
13271 1727203837.63495: done dumping result, returning
13271 1727203837.63511: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [028d2410-947f-2a40-12ba-0000000003fc]
13271 1727203837.63526: sending task result for task 028d2410-947f-2a40-12ba-0000000003fc
13271 1727203837.63606: done sending task result for task 028d2410-947f-2a40-12ba-0000000003fc
13271 1727203837.63609: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": true,
        "lsr_net_profile_exists": true,
        "lsr_net_profile_fingerprint": true
    },
    "changed": false
}
13271 1727203837.63667: no more pending results, returning what we have
13271 1727203837.63670: results queue empty
13271 1727203837.63671: checking for any_errors_fatal
13271 1727203837.63689: done checking for any_errors_fatal
13271 1727203837.63690: checking for max_fail_percentage
13271 1727203837.63692: done checking for max_fail_percentage
13271 1727203837.63693: checking to see if all hosts have failed and the running result is not ok
13271 1727203837.63693: done checking to see if all hosts have failed
13271 1727203837.63694: getting the remaining hosts for this loop
13271 1727203837.63696: done getting the remaining hosts for this loop
13271 1727203837.63699: getting the next task for host managed-node1
13271 1727203837.63709: done getting next task for host managed-node1
13271 1727203837.63713: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }}
13271 1727203837.63717: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203837.63720: getting variables
13271 1727203837.63722: in VariableManager get_vars()
13271 1727203837.63765: Calling all_inventory to load vars for managed-node1
13271 1727203837.63768: Calling groups_inventory to load vars for managed-node1
13271 1727203837.63770: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203837.63888: Calling all_plugins_play to load vars for managed-node1
13271 1727203837.63901: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203837.63906: Calling groups_plugins_play to load vars for managed-node1
13271 1727203837.65573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203837.67267: done with get_vars()
13271 1727203837.67296: done getting variables
13271 1727203837.67357: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
13271 1727203837.67490: variable 'profile' from source: include params
13271 1727203837.67494: variable 'item' from source: include params
13271 1727203837.67554: variable 'item' from source: include params

TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49
Tuesday 24 September 2024 14:50:37 -0400 (0:00:00.065) 0:00:21.319 *****
13271 1727203837.67596: entering _queue_task() for managed-node1/command
13271 1727203837.68018: worker is 1 (out of 1 available)
13271 1727203837.68028: exiting _queue_task() for managed-node1/command
13271 1727203837.68038: done queuing things up, now waiting for results queue to drain
13271 1727203837.68040: waiting for pending results...
13271 1727203837.68494: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 13271 1727203837.68499: in run() - task 028d2410-947f-2a40-12ba-0000000003fe 13271 1727203837.68502: variable 'ansible_search_path' from source: unknown 13271 1727203837.68505: variable 'ansible_search_path' from source: unknown 13271 1727203837.68508: calling self._execute() 13271 1727203837.68681: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203837.68685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203837.68688: variable 'omit' from source: magic vars 13271 1727203837.69081: variable 'ansible_distribution_major_version' from source: facts 13271 1727203837.69085: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203837.69087: variable 'profile_stat' from source: set_fact 13271 1727203837.69090: Evaluated conditional (profile_stat.stat.exists): False 13271 1727203837.69092: when evaluation is False, skipping this task 13271 1727203837.69095: _execute() done 13271 1727203837.69098: dumping result to json 13271 1727203837.69100: done dumping result, returning 13271 1727203837.69103: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [028d2410-947f-2a40-12ba-0000000003fe] 13271 1727203837.69105: sending task result for task 028d2410-947f-2a40-12ba-0000000003fe 13271 1727203837.69177: done sending task result for task 028d2410-947f-2a40-12ba-0000000003fe 13271 1727203837.69181: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13271 1727203837.69263: no more pending results, returning what we have 13271 1727203837.69266: results queue empty 13271 1727203837.69267: checking for any_errors_fatal 13271 1727203837.69274: done checking for any_errors_fatal 13271 1727203837.69277: 
checking for max_fail_percentage 13271 1727203837.69279: done checking for max_fail_percentage 13271 1727203837.69279: checking to see if all hosts have failed and the running result is not ok 13271 1727203837.69280: done checking to see if all hosts have failed 13271 1727203837.69281: getting the remaining hosts for this loop 13271 1727203837.69283: done getting the remaining hosts for this loop 13271 1727203837.69286: getting the next task for host managed-node1 13271 1727203837.69294: done getting next task for host managed-node1 13271 1727203837.69298: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13271 1727203837.69302: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203837.69307: getting variables 13271 1727203837.69309: in VariableManager get_vars() 13271 1727203837.69350: Calling all_inventory to load vars for managed-node1 13271 1727203837.69353: Calling groups_inventory to load vars for managed-node1 13271 1727203837.69356: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203837.69370: Calling all_plugins_play to load vars for managed-node1 13271 1727203837.69373: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203837.69584: Calling groups_plugins_play to load vars for managed-node1 13271 1727203837.71120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203837.74609: done with get_vars() 13271 1727203837.74644: done getting variables 13271 1727203837.74714: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203837.74854: variable 'profile' from source: include params 13271 1727203837.74858: variable 'item' from source: include params 13271 1727203837.74928: variable 'item' from source: include params
TASK [Verify the ansible_managed comment in ifcfg-bond0.0] *********************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Tuesday 24 September 2024 14:50:37 -0400 (0:00:00.073) 0:00:21.392 *****
13271 1727203837.74965: entering _queue_task() for managed-node1/set_fact 13271 1727203837.75491: worker is 1 (out of 1 available) 13271 1727203837.75502: exiting _queue_task() for managed-node1/set_fact 13271 1727203837.75512: done queuing things up, now waiting for results queue to drain 13271 1727203837.75514: waiting for pending results...
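Every check in this stretch of the log is skipped for the same reason: the tasks in get_profile_stat.yml are gated on the `profile_stat.stat.exists` conditional, which evaluates False because no ifcfg-bond0.0 file exists. A minimal sketch of the pattern the log is exercising (the task names and the conditional are taken from the log entries; the stat path and command body are illustrative assumptions, not the actual file contents):

```yaml
# Sketch only -- get_profile_stat.yml is not reproduced in this log.
# The `when:` condition and task names come from the log; the stat
# path and the grep command are hypothetical.
- name: Stat the profile file
  stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: profile_stat

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep '^# ansible_managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  when: profile_stat.stat.exists   # False in this run, so the task is skipped
```

With `profile_stat.stat.exists` False, each gated task reports `skip_reason: Conditional result was False`, which is exactly what the repeated `skipping: [managed-node1]` results in this log show.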
13271 1727203837.76052: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 13271 1727203837.76283: in run() - task 028d2410-947f-2a40-12ba-0000000003ff 13271 1727203837.76381: variable 'ansible_search_path' from source: unknown 13271 1727203837.76385: variable 'ansible_search_path' from source: unknown 13271 1727203837.76427: calling self._execute() 13271 1727203837.76681: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203837.76709: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203837.76719: variable 'omit' from source: magic vars 13271 1727203837.77351: variable 'ansible_distribution_major_version' from source: facts 13271 1727203837.77362: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203837.77583: variable 'profile_stat' from source: set_fact 13271 1727203837.77587: Evaluated conditional (profile_stat.stat.exists): False 13271 1727203837.77589: when evaluation is False, skipping this task 13271 1727203837.77592: _execute() done 13271 1727203837.77594: dumping result to json 13271 1727203837.77596: done dumping result, returning 13271 1727203837.77599: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [028d2410-947f-2a40-12ba-0000000003ff] 13271 1727203837.77601: sending task result for task 028d2410-947f-2a40-12ba-0000000003ff 13271 1727203837.77668: done sending task result for task 028d2410-947f-2a40-12ba-0000000003ff 13271 1727203837.77671: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
13271 1727203837.77724: no more pending results, returning what we have 13271 1727203837.77739: results queue empty 13271 1727203837.77740: checking for any_errors_fatal 13271 1727203837.77746: done checking for any_errors_fatal 13271 1727203837.77746:
checking for max_fail_percentage 13271 1727203837.77749: done checking for max_fail_percentage 13271 1727203837.77749: checking to see if all hosts have failed and the running result is not ok 13271 1727203837.77750: done checking to see if all hosts have failed 13271 1727203837.77751: getting the remaining hosts for this loop 13271 1727203837.77752: done getting the remaining hosts for this loop 13271 1727203837.77756: getting the next task for host managed-node1 13271 1727203837.77771: done getting next task for host managed-node1 13271 1727203837.77776: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13271 1727203837.77781: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203837.77786: getting variables 13271 1727203837.77788: in VariableManager get_vars() 13271 1727203837.77830: Calling all_inventory to load vars for managed-node1 13271 1727203837.77833: Calling groups_inventory to load vars for managed-node1 13271 1727203837.77835: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203837.77849: Calling all_plugins_play to load vars for managed-node1 13271 1727203837.77852: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203837.77857: Calling groups_plugins_play to load vars for managed-node1 13271 1727203837.79850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203837.82588: done with get_vars() 13271 1727203837.82611: done getting variables 13271 1727203837.82693: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203837.82820: variable 'profile' from source: include params 13271 1727203837.82824: variable 'item' from source: include params 13271 1727203837.82897: variable 'item' from source: include params
TASK [Get the fingerprint comment in ifcfg-bond0.0] ****************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Tuesday 24 September 2024 14:50:37 -0400 (0:00:00.079) 0:00:21.472 *****
13271 1727203837.82931: entering _queue_task() for managed-node1/command 13271 1727203837.83625: worker is 1 (out of 1 available) 13271 1727203837.83637: exiting _queue_task() for managed-node1/command 13271 1727203837.83647: done queuing things up, now waiting for results queue to drain 13271 1727203837.83648: waiting for pending results...
13271 1727203837.84194: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 13271 1727203837.84306: in run() - task 028d2410-947f-2a40-12ba-000000000400 13271 1727203837.84321: variable 'ansible_search_path' from source: unknown 13271 1727203837.84325: variable 'ansible_search_path' from source: unknown 13271 1727203837.84439: calling self._execute() 13271 1727203837.84621: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203837.84627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203837.84683: variable 'omit' from source: magic vars 13271 1727203837.85456: variable 'ansible_distribution_major_version' from source: facts 13271 1727203837.85537: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203837.85825: variable 'profile_stat' from source: set_fact 13271 1727203837.85839: Evaluated conditional (profile_stat.stat.exists): False 13271 1727203837.85843: when evaluation is False, skipping this task 13271 1727203837.85846: _execute() done 13271 1727203837.85849: dumping result to json 13271 1727203837.85852: done dumping result, returning 13271 1727203837.85980: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 [028d2410-947f-2a40-12ba-000000000400] 13271 1727203837.85984: sending task result for task 028d2410-947f-2a40-12ba-000000000400 13271 1727203837.86048: done sending task result for task 028d2410-947f-2a40-12ba-000000000400 13271 1727203837.86052: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
13271 1727203837.86134: no more pending results, returning what we have 13271 1727203837.86138: results queue empty 13271 1727203837.86139: checking for any_errors_fatal 13271 1727203837.86147: done checking for any_errors_fatal 13271 1727203837.86148: checking for
max_fail_percentage 13271 1727203837.86150: done checking for max_fail_percentage 13271 1727203837.86150: checking to see if all hosts have failed and the running result is not ok 13271 1727203837.86151: done checking to see if all hosts have failed 13271 1727203837.86152: getting the remaining hosts for this loop 13271 1727203837.86153: done getting the remaining hosts for this loop 13271 1727203837.86157: getting the next task for host managed-node1 13271 1727203837.86167: done getting next task for host managed-node1 13271 1727203837.86170: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13271 1727203837.86175: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203837.86181: getting variables 13271 1727203837.86183: in VariableManager get_vars() 13271 1727203837.86226: Calling all_inventory to load vars for managed-node1 13271 1727203837.86229: Calling groups_inventory to load vars for managed-node1 13271 1727203837.86231: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203837.86245: Calling all_plugins_play to load vars for managed-node1 13271 1727203837.86248: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203837.86250: Calling groups_plugins_play to load vars for managed-node1 13271 1727203837.87985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203837.91549: done with get_vars() 13271 1727203837.91582: done getting variables 13271 1727203837.91651: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203837.91788: variable 'profile' from source: include params 13271 1727203837.91792: variable 'item' from source: include params 13271 1727203837.91864: variable 'item' from source: include params
TASK [Verify the fingerprint comment in ifcfg-bond0.0] *************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Tuesday 24 September 2024 14:50:37 -0400 (0:00:00.089) 0:00:21.562 *****
13271 1727203837.91898: entering _queue_task() for managed-node1/set_fact 13271 1727203837.92302: worker is 1 (out of 1 available) 13271 1727203837.92314: exiting _queue_task() for managed-node1/set_fact 13271 1727203837.92326: done queuing things up, now waiting for results queue to drain 13271 1727203837.92328: waiting for pending results...
13271 1727203837.92696: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 13271 1727203837.92733: in run() - task 028d2410-947f-2a40-12ba-000000000401 13271 1727203837.92747: variable 'ansible_search_path' from source: unknown 13271 1727203837.92750: variable 'ansible_search_path' from source: unknown 13271 1727203837.92791: calling self._execute() 13271 1727203837.92898: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203837.92909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203837.93081: variable 'omit' from source: magic vars 13271 1727203837.93320: variable 'ansible_distribution_major_version' from source: facts 13271 1727203837.93361: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203837.93614: variable 'profile_stat' from source: set_fact 13271 1727203837.93626: Evaluated conditional (profile_stat.stat.exists): False 13271 1727203837.93629: when evaluation is False, skipping this task 13271 1727203837.93632: _execute() done 13271 1727203837.93634: dumping result to json 13271 1727203837.93636: done dumping result, returning 13271 1727203837.93643: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [028d2410-947f-2a40-12ba-000000000401] 13271 1727203837.93649: sending task result for task 028d2410-947f-2a40-12ba-000000000401 13271 1727203837.93853: done sending task result for task 028d2410-947f-2a40-12ba-000000000401 13271 1727203837.93858: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
13271 1727203837.94029: no more pending results, returning what we have 13271 1727203837.94033: results queue empty 13271 1727203837.94033: checking for any_errors_fatal 13271 1727203837.94038: done checking for any_errors_fatal 13271 1727203837.94039: checking
for max_fail_percentage 13271 1727203837.94041: done checking for max_fail_percentage 13271 1727203837.94041: checking to see if all hosts have failed and the running result is not ok 13271 1727203837.94042: done checking to see if all hosts have failed 13271 1727203837.94043: getting the remaining hosts for this loop 13271 1727203837.94044: done getting the remaining hosts for this loop 13271 1727203837.94048: getting the next task for host managed-node1 13271 1727203837.94057: done getting next task for host managed-node1 13271 1727203837.94063: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13271 1727203837.94067: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203837.94072: getting variables 13271 1727203837.94073: in VariableManager get_vars() 13271 1727203837.94121: Calling all_inventory to load vars for managed-node1 13271 1727203837.94123: Calling groups_inventory to load vars for managed-node1 13271 1727203837.94126: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203837.94140: Calling all_plugins_play to load vars for managed-node1 13271 1727203837.94143: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203837.94146: Calling groups_plugins_play to load vars for managed-node1 13271 1727203837.96679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203837.98358: done with get_vars() 13271 1727203837.98384: done getting variables 13271 1727203837.98444: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203837.98562: variable 'profile' from source: include params 13271 1727203837.98566: variable 'item' from source: include params 13271 1727203837.98627: variable 'item' from source: include params
TASK [Assert that the profile is present - 'bond0.0'] **************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5
Tuesday 24 September 2024 14:50:37 -0400 (0:00:00.067) 0:00:21.629 *****
13271 1727203837.98659: entering _queue_task() for managed-node1/assert 13271 1727203837.98978: worker is 1 (out of 1 available) 13271 1727203837.98993: exiting _queue_task() for managed-node1/assert 13271 1727203837.99006: done queuing things up, now waiting for results queue to drain 13271 1727203837.99007: waiting for pending results...
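The assertions that run next check facts set earlier in the run (`lsr_net_profile_exists` and `lsr_net_profile_ansible_managed`). Judging from the task names, the evaluated conditionals, and the assert_profile_present.yml path in the log, those tasks are roughly of this shape (a sketch, not the verbatim file):

```yaml
# Sketch of assert_profile_present.yml based only on what this log
# shows; exact layout and any failure messages are assumptions.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists

- name: Assert that the ansible managed comment is present in '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_ansible_managed
```

Both conditions evaluate True in this run, so each task returns `ok` with `All assertions passed`.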
13271 1727203837.99197: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.0' 13271 1727203837.99272: in run() - task 028d2410-947f-2a40-12ba-000000000267 13271 1727203837.99285: variable 'ansible_search_path' from source: unknown 13271 1727203837.99288: variable 'ansible_search_path' from source: unknown 13271 1727203837.99322: calling self._execute() 13271 1727203837.99395: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203837.99404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203837.99413: variable 'omit' from source: magic vars 13271 1727203837.99682: variable 'ansible_distribution_major_version' from source: facts 13271 1727203837.99691: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203837.99697: variable 'omit' from source: magic vars 13271 1727203837.99728: variable 'omit' from source: magic vars 13271 1727203837.99801: variable 'profile' from source: include params 13271 1727203837.99805: variable 'item' from source: include params 13271 1727203837.99850: variable 'item' from source: include params 13271 1727203837.99868: variable 'omit' from source: magic vars 13271 1727203837.99907: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203837.99933: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203837.99951: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203837.99968: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203837.99979: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203838.00004: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 13271 1727203838.00007: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.00010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203838.00079: Set connection var ansible_connection to ssh 13271 1727203838.00086: Set connection var ansible_shell_type to sh 13271 1727203838.00094: Set connection var ansible_timeout to 10 13271 1727203838.00099: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203838.00104: Set connection var ansible_pipelining to False 13271 1727203838.00109: Set connection var ansible_shell_executable to /bin/sh 13271 1727203838.00128: variable 'ansible_shell_executable' from source: unknown 13271 1727203838.00131: variable 'ansible_connection' from source: unknown 13271 1727203838.00134: variable 'ansible_module_compression' from source: unknown 13271 1727203838.00137: variable 'ansible_shell_type' from source: unknown 13271 1727203838.00139: variable 'ansible_shell_executable' from source: unknown 13271 1727203838.00142: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.00144: variable 'ansible_pipelining' from source: unknown 13271 1727203838.00147: variable 'ansible_timeout' from source: unknown 13271 1727203838.00149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203838.00251: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203838.00262: variable 'omit' from source: magic vars 13271 1727203838.00269: starting attempt loop 13271 1727203838.00272: running the handler 13271 1727203838.00349: variable 'lsr_net_profile_exists' from source: set_fact 13271 1727203838.00353: Evaluated conditional 
(lsr_net_profile_exists): True 13271 1727203838.00360: handler run complete 13271 1727203838.00378: attempt loop complete, returning result 13271 1727203838.00381: _execute() done 13271 1727203838.00384: dumping result to json 13271 1727203838.00386: done dumping result, returning 13271 1727203838.00389: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.0' [028d2410-947f-2a40-12ba-000000000267] 13271 1727203838.00393: sending task result for task 028d2410-947f-2a40-12ba-000000000267
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
13271 1727203838.00701: no more pending results, returning what we have 13271 1727203838.00704: results queue empty 13271 1727203838.00705: checking for any_errors_fatal 13271 1727203838.00711: done checking for any_errors_fatal 13271 1727203838.00712: checking for max_fail_percentage 13271 1727203838.00714: done checking for max_fail_percentage 13271 1727203838.00714: checking to see if all hosts have failed and the running result is not ok 13271 1727203838.00715: done checking to see if all hosts have failed 13271 1727203838.00716: getting the remaining hosts for this loop 13271 1727203838.00717: done getting the remaining hosts for this loop 13271 1727203838.00720: getting the next task for host managed-node1 13271 1727203838.00726: done getting next task for host managed-node1 13271 1727203838.00728: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13271 1727203838.00731: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue?
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203838.00735: getting variables 13271 1727203838.00738: in VariableManager get_vars() 13271 1727203838.00787: Calling all_inventory to load vars for managed-node1 13271 1727203838.00790: Calling groups_inventory to load vars for managed-node1 13271 1727203838.00792: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203838.00803: Calling all_plugins_play to load vars for managed-node1 13271 1727203838.00806: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203838.00808: Calling groups_plugins_play to load vars for managed-node1 13271 1727203838.01389: done sending task result for task 028d2410-947f-2a40-12ba-000000000267 13271 1727203838.01392: WORKER PROCESS EXITING 13271 1727203838.02139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203838.03307: done with get_vars() 13271 1727203838.03324: done getting variables 13271 1727203838.03366: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203838.03450: variable 'profile' from source: include params 13271 1727203838.03452: variable 'item' from source: include params 13271 1727203838.03495: variable 'item' from source: include params
TASK [Assert that the ansible managed comment is present in 'bond0.0'] *********
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10
Tuesday 24 September 2024 14:50:38 -0400 (0:00:00.048) 0:00:21.678 *****
13271 1727203838.03522: entering _queue_task() for managed-node1/assert 13271
1727203838.03755: worker is 1 (out of 1 available) 13271 1727203838.03769: exiting _queue_task() for managed-node1/assert 13271 1727203838.03783: done queuing things up, now waiting for results queue to drain 13271 1727203838.03785: waiting for pending results... 13271 1727203838.03959: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' 13271 1727203838.04035: in run() - task 028d2410-947f-2a40-12ba-000000000268 13271 1727203838.04046: variable 'ansible_search_path' from source: unknown 13271 1727203838.04050: variable 'ansible_search_path' from source: unknown 13271 1727203838.04082: calling self._execute() 13271 1727203838.04151: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.04155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203838.04167: variable 'omit' from source: magic vars 13271 1727203838.04562: variable 'ansible_distribution_major_version' from source: facts 13271 1727203838.04566: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203838.04578: variable 'omit' from source: magic vars 13271 1727203838.04581: variable 'omit' from source: magic vars 13271 1727203838.04783: variable 'profile' from source: include params 13271 1727203838.04786: variable 'item' from source: include params 13271 1727203838.04789: variable 'item' from source: include params 13271 1727203838.04791: variable 'omit' from source: magic vars 13271 1727203838.04812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203838.04833: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203838.04853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203838.04873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 13271 1727203838.04891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203838.04917: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203838.04920: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.04922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203838.05018: Set connection var ansible_connection to ssh 13271 1727203838.05026: Set connection var ansible_shell_type to sh 13271 1727203838.05035: Set connection var ansible_timeout to 10 13271 1727203838.05040: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203838.05046: Set connection var ansible_pipelining to False 13271 1727203838.05051: Set connection var ansible_shell_executable to /bin/sh 13271 1727203838.05079: variable 'ansible_shell_executable' from source: unknown 13271 1727203838.05083: variable 'ansible_connection' from source: unknown 13271 1727203838.05085: variable 'ansible_module_compression' from source: unknown 13271 1727203838.05087: variable 'ansible_shell_type' from source: unknown 13271 1727203838.05090: variable 'ansible_shell_executable' from source: unknown 13271 1727203838.05092: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.05124: variable 'ansible_pipelining' from source: unknown 13271 1727203838.05128: variable 'ansible_timeout' from source: unknown 13271 1727203838.05131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203838.05233: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 
1727203838.05246: variable 'omit' from source: magic vars 13271 1727203838.05251: starting attempt loop 13271 1727203838.05254: running the handler 13271 1727203838.05358: variable 'lsr_net_profile_ansible_managed' from source: set_fact 13271 1727203838.05362: Evaluated conditional (lsr_net_profile_ansible_managed): True 13271 1727203838.05461: handler run complete 13271 1727203838.05464: attempt loop complete, returning result 13271 1727203838.05466: _execute() done 13271 1727203838.05468: dumping result to json 13271 1727203838.05470: done dumping result, returning 13271 1727203838.05472: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' [028d2410-947f-2a40-12ba-000000000268] 13271 1727203838.05473: sending task result for task 028d2410-947f-2a40-12ba-000000000268 13271 1727203838.05530: done sending task result for task 028d2410-947f-2a40-12ba-000000000268 13271 1727203838.05532: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
13271 1727203838.05599: no more pending results, returning what we have 13271 1727203838.05602: results queue empty 13271 1727203838.05603: checking for any_errors_fatal 13271 1727203838.05609: done checking for any_errors_fatal 13271 1727203838.05610: checking for max_fail_percentage 13271 1727203838.05612: done checking for max_fail_percentage 13271 1727203838.05612: checking to see if all hosts have failed and the running result is not ok 13271 1727203838.05613: done checking to see if all hosts have failed 13271 1727203838.05614: getting the remaining hosts for this loop 13271 1727203838.05615: done getting the remaining hosts for this loop 13271 1727203838.05618: getting the next task for host managed-node1 13271 1727203838.05624: done getting next task for host managed-node1 13271 1727203838.05626: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 13271 1727203838.05628: ^ state is: HOST
STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203838.05632: getting variables 13271 1727203838.05634: in VariableManager get_vars() 13271 1727203838.05666: Calling all_inventory to load vars for managed-node1 13271 1727203838.05668: Calling groups_inventory to load vars for managed-node1 13271 1727203838.05671: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203838.05681: Calling all_plugins_play to load vars for managed-node1 13271 1727203838.05684: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203838.05686: Calling groups_plugins_play to load vars for managed-node1 13271 1727203838.06641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203838.07497: done with get_vars() 13271 1727203838.07511: done getting variables 13271 1727203838.07550: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203838.07629: variable 'profile' from source: include params 13271 1727203838.07632: variable 'item' from source: include params 13271 1727203838.07671: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:50:38 -0400 (0:00:00.041) 0:00:21.720 ***** 13271 1727203838.07700: entering _queue_task() for managed-node1/assert 13271 1727203838.07914: worker is 1 (out of 1 available) 13271 1727203838.07928: exiting _queue_task() for managed-node1/assert 13271 1727203838.07939: done queuing things up, now waiting for results queue to drain 13271 1727203838.07941: waiting for pending results... 13271 1727203838.08115: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.0 13271 1727203838.08180: in run() - task 028d2410-947f-2a40-12ba-000000000269 13271 1727203838.08194: variable 'ansible_search_path' from source: unknown 13271 1727203838.08197: variable 'ansible_search_path' from source: unknown 13271 1727203838.08224: calling self._execute() 13271 1727203838.08301: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.08306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203838.08315: variable 'omit' from source: magic vars 13271 1727203838.08573: variable 'ansible_distribution_major_version' from source: facts 13271 1727203838.08584: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203838.08590: variable 'omit' from source: magic vars 13271 1727203838.08622: variable 'omit' from source: magic vars 13271 1727203838.08694: variable 'profile' from source: include params 13271 1727203838.08698: variable 'item' from source: include params 13271 1727203838.08744: variable 'item' from source: include params 13271 1727203838.08759: variable 'omit' from source: magic vars 13271 1727203838.08794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203838.08826: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203838.08839: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203838.08853: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203838.08862: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203838.08889: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203838.08893: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.08895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203838.08964: Set connection var ansible_connection to ssh 13271 1727203838.08973: Set connection var ansible_shell_type to sh 13271 1727203838.08982: Set connection var ansible_timeout to 10 13271 1727203838.08986: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203838.08991: Set connection var ansible_pipelining to False 13271 1727203838.08996: Set connection var ansible_shell_executable to /bin/sh 13271 1727203838.09014: variable 'ansible_shell_executable' from source: unknown 13271 1727203838.09016: variable 'ansible_connection' from source: unknown 13271 1727203838.09019: variable 'ansible_module_compression' from source: unknown 13271 1727203838.09021: variable 'ansible_shell_type' from source: unknown 13271 1727203838.09024: variable 'ansible_shell_executable' from source: unknown 13271 1727203838.09027: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.09030: variable 'ansible_pipelining' from source: unknown 13271 1727203838.09035: variable 'ansible_timeout' from source: unknown 13271 1727203838.09038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 
1727203838.09135: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203838.09145: variable 'omit' from source: magic vars 13271 1727203838.09158: starting attempt loop 13271 1727203838.09161: running the handler 13271 1727203838.09228: variable 'lsr_net_profile_fingerprint' from source: set_fact 13271 1727203838.09232: Evaluated conditional (lsr_net_profile_fingerprint): True 13271 1727203838.09237: handler run complete 13271 1727203838.09248: attempt loop complete, returning result 13271 1727203838.09251: _execute() done 13271 1727203838.09253: dumping result to json 13271 1727203838.09258: done dumping result, returning 13271 1727203838.09269: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.0 [028d2410-947f-2a40-12ba-000000000269] 13271 1727203838.09272: sending task result for task 028d2410-947f-2a40-12ba-000000000269 13271 1727203838.09343: done sending task result for task 028d2410-947f-2a40-12ba-000000000269 13271 1727203838.09345: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 13271 1727203838.09416: no more pending results, returning what we have 13271 1727203838.09418: results queue empty 13271 1727203838.09419: checking for any_errors_fatal 13271 1727203838.09425: done checking for any_errors_fatal 13271 1727203838.09426: checking for max_fail_percentage 13271 1727203838.09427: done checking for max_fail_percentage 13271 1727203838.09428: checking to see if all hosts have failed and the running result is not ok 13271 1727203838.09429: done checking to see if all hosts have failed 13271 1727203838.09429: getting the remaining hosts for this loop 13271 1727203838.09430: done getting the 
remaining hosts for this loop 13271 1727203838.09433: getting the next task for host managed-node1 13271 1727203838.09441: done getting next task for host managed-node1 13271 1727203838.09444: ^ task is: TASK: Include the task 'get_profile_stat.yml' 13271 1727203838.09446: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203838.09450: getting variables 13271 1727203838.09451: in VariableManager get_vars() 13271 1727203838.09485: Calling all_inventory to load vars for managed-node1 13271 1727203838.09488: Calling groups_inventory to load vars for managed-node1 13271 1727203838.09490: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203838.09499: Calling all_plugins_play to load vars for managed-node1 13271 1727203838.09501: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203838.09503: Calling groups_plugins_play to load vars for managed-node1 13271 1727203838.10256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203838.11135: done with get_vars() 13271 1727203838.11149: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:50:38 -0400 (0:00:00.035) 0:00:21.755 ***** 13271 1727203838.11219: entering 
_queue_task() for managed-node1/include_tasks 13271 1727203838.11424: worker is 1 (out of 1 available) 13271 1727203838.11437: exiting _queue_task() for managed-node1/include_tasks 13271 1727203838.11448: done queuing things up, now waiting for results queue to drain 13271 1727203838.11449: waiting for pending results... 13271 1727203838.11618: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 13271 1727203838.11701: in run() - task 028d2410-947f-2a40-12ba-00000000026d 13271 1727203838.11712: variable 'ansible_search_path' from source: unknown 13271 1727203838.11715: variable 'ansible_search_path' from source: unknown 13271 1727203838.11744: calling self._execute() 13271 1727203838.11815: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.11820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203838.11828: variable 'omit' from source: magic vars 13271 1727203838.12087: variable 'ansible_distribution_major_version' from source: facts 13271 1727203838.12096: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203838.12103: _execute() done 13271 1727203838.12106: dumping result to json 13271 1727203838.12110: done dumping result, returning 13271 1727203838.12112: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [028d2410-947f-2a40-12ba-00000000026d] 13271 1727203838.12122: sending task result for task 028d2410-947f-2a40-12ba-00000000026d 13271 1727203838.12205: done sending task result for task 028d2410-947f-2a40-12ba-00000000026d 13271 1727203838.12208: WORKER PROCESS EXITING 13271 1727203838.12247: no more pending results, returning what we have 13271 1727203838.12251: in VariableManager get_vars() 13271 1727203838.12295: Calling all_inventory to load vars for managed-node1 13271 1727203838.12298: Calling groups_inventory to load vars for managed-node1 13271 1727203838.12300: Calling 
all_plugins_inventory to load vars for managed-node1 13271 1727203838.12310: Calling all_plugins_play to load vars for managed-node1 13271 1727203838.12312: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203838.12315: Calling groups_plugins_play to load vars for managed-node1 13271 1727203838.13650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203838.14507: done with get_vars() 13271 1727203838.14520: variable 'ansible_search_path' from source: unknown 13271 1727203838.14521: variable 'ansible_search_path' from source: unknown 13271 1727203838.14545: we have included files to process 13271 1727203838.14546: generating all_blocks data 13271 1727203838.14548: done generating all_blocks data 13271 1727203838.14552: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13271 1727203838.14553: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13271 1727203838.14555: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 13271 1727203838.15124: done processing included file 13271 1727203838.15126: iterating over new_blocks loaded from include file 13271 1727203838.15127: in VariableManager get_vars() 13271 1727203838.15140: done with get_vars() 13271 1727203838.15141: filtering new block on tags 13271 1727203838.15155: done filtering new block on tags 13271 1727203838.15157: in VariableManager get_vars() 13271 1727203838.15168: done with get_vars() 13271 1727203838.15169: filtering new block on tags 13271 1727203838.15183: done filtering new block on tags 13271 1727203838.15185: done iterating over new_blocks loaded from include file included: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 13271 1727203838.15189: extending task lists for all hosts with included blocks 13271 1727203838.15287: done extending task lists 13271 1727203838.15288: done processing included files 13271 1727203838.15289: results queue empty 13271 1727203838.15289: checking for any_errors_fatal 13271 1727203838.15291: done checking for any_errors_fatal 13271 1727203838.15291: checking for max_fail_percentage 13271 1727203838.15292: done checking for max_fail_percentage 13271 1727203838.15292: checking to see if all hosts have failed and the running result is not ok 13271 1727203838.15293: done checking to see if all hosts have failed 13271 1727203838.15294: getting the remaining hosts for this loop 13271 1727203838.15294: done getting the remaining hosts for this loop 13271 1727203838.15296: getting the next task for host managed-node1 13271 1727203838.15298: done getting next task for host managed-node1 13271 1727203838.15299: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 13271 1727203838.15301: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 13271 1727203838.15303: getting variables 13271 1727203838.15303: in VariableManager get_vars() 13271 1727203838.15313: Calling all_inventory to load vars for managed-node1 13271 1727203838.15315: Calling groups_inventory to load vars for managed-node1 13271 1727203838.15317: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203838.15321: Calling all_plugins_play to load vars for managed-node1 13271 1727203838.15322: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203838.15324: Calling groups_plugins_play to load vars for managed-node1 13271 1727203838.16467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203838.17989: done with get_vars() 13271 1727203838.18010: done getting variables 13271 1727203838.18051: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:50:38 -0400 (0:00:00.068) 0:00:21.824 ***** 13271 1727203838.18084: entering _queue_task() for managed-node1/set_fact 13271 1727203838.18399: worker is 1 (out of 1 available) 13271 1727203838.18411: exiting _queue_task() for managed-node1/set_fact 13271 1727203838.18423: done queuing things up, now waiting for results queue to drain 13271 1727203838.18425: waiting for pending results... 
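The task just queued above ("Initialize NM profile exist and ansible_managed comment flag", from get_profile_stat.yml:3) is a local `set_fact` action. Based on the `ansible_facts` it returns later in this trace, the task likely looks something like the following — a hypothetical reconstruction; the fact names match the trace, but the exact task layout is an assumption:

```yaml
# Sketch of get_profile_stat.yml:3 (reconstructed, not the actual file).
# The three flags are reset before the profile is re-checked; the trace
# below shows exactly these facts coming back as false.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

Because `set_fact` runs entirely on the controller, the trace for this task shows no `_low_level_execute_command()` calls — compare it with the `stat` task that follows.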
13271 1727203838.18798: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 13271 1727203838.18808: in run() - task 028d2410-947f-2a40-12ba-000000000440 13271 1727203838.18812: variable 'ansible_search_path' from source: unknown 13271 1727203838.18816: variable 'ansible_search_path' from source: unknown 13271 1727203838.18850: calling self._execute() 13271 1727203838.18945: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.18949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203838.18964: variable 'omit' from source: magic vars 13271 1727203838.19317: variable 'ansible_distribution_major_version' from source: facts 13271 1727203838.19326: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203838.19334: variable 'omit' from source: magic vars 13271 1727203838.19382: variable 'omit' from source: magic vars 13271 1727203838.19418: variable 'omit' from source: magic vars 13271 1727203838.19456: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203838.19500: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203838.19543: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203838.19546: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203838.19549: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203838.19583: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203838.19591: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.19594: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 13271 1727203838.19764: Set connection var ansible_connection to ssh 13271 1727203838.19767: Set connection var ansible_shell_type to sh 13271 1727203838.19770: Set connection var ansible_timeout to 10 13271 1727203838.19772: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203838.19773: Set connection var ansible_pipelining to False 13271 1727203838.19777: Set connection var ansible_shell_executable to /bin/sh 13271 1727203838.19779: variable 'ansible_shell_executable' from source: unknown 13271 1727203838.19781: variable 'ansible_connection' from source: unknown 13271 1727203838.19784: variable 'ansible_module_compression' from source: unknown 13271 1727203838.19790: variable 'ansible_shell_type' from source: unknown 13271 1727203838.19792: variable 'ansible_shell_executable' from source: unknown 13271 1727203838.19795: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.19796: variable 'ansible_pipelining' from source: unknown 13271 1727203838.19798: variable 'ansible_timeout' from source: unknown 13271 1727203838.19800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203838.19890: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203838.19904: variable 'omit' from source: magic vars 13271 1727203838.19910: starting attempt loop 13271 1727203838.19913: running the handler 13271 1727203838.19981: handler run complete 13271 1727203838.19984: attempt loop complete, returning result 13271 1727203838.19986: _execute() done 13271 1727203838.19988: dumping result to json 13271 1727203838.19990: done dumping result, returning 13271 1727203838.19992: done running TaskExecutor() for 
managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag [028d2410-947f-2a40-12ba-000000000440] 13271 1727203838.19994: sending task result for task 028d2410-947f-2a40-12ba-000000000440 13271 1727203838.20050: done sending task result for task 028d2410-947f-2a40-12ba-000000000440 13271 1727203838.20054: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 13271 1727203838.20115: no more pending results, returning what we have 13271 1727203838.20119: results queue empty 13271 1727203838.20120: checking for any_errors_fatal 13271 1727203838.20122: done checking for any_errors_fatal 13271 1727203838.20122: checking for max_fail_percentage 13271 1727203838.20124: done checking for max_fail_percentage 13271 1727203838.20125: checking to see if all hosts have failed and the running result is not ok 13271 1727203838.20126: done checking to see if all hosts have failed 13271 1727203838.20126: getting the remaining hosts for this loop 13271 1727203838.20128: done getting the remaining hosts for this loop 13271 1727203838.20131: getting the next task for host managed-node1 13271 1727203838.20139: done getting next task for host managed-node1 13271 1727203838.20141: ^ task is: TASK: Stat profile file 13271 1727203838.20145: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203838.20149: getting variables 13271 1727203838.20151: in VariableManager get_vars() 13271 1727203838.20196: Calling all_inventory to load vars for managed-node1 13271 1727203838.20199: Calling groups_inventory to load vars for managed-node1 13271 1727203838.20201: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203838.20214: Calling all_plugins_play to load vars for managed-node1 13271 1727203838.20217: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203838.20220: Calling groups_plugins_play to load vars for managed-node1 13271 1727203838.21697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203838.23231: done with get_vars() 13271 1727203838.23251: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:50:38 -0400 (0:00:00.052) 0:00:21.876 ***** 13271 1727203838.23336: entering _queue_task() for managed-node1/stat 13271 1727203838.23668: worker is 1 (out of 1 available) 13271 1727203838.23882: exiting _queue_task() for managed-node1/stat 13271 1727203838.23893: done queuing things up, now waiting for results queue to drain 13271 1727203838.23895: waiting for pending results... 
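Unlike the `assert` and `set_fact` actions above, the `stat` task queued here executes a module on the remote host, so the trace that follows shows Ansible's ssh bootstrap: it first discovers the remote home directory, then creates a mode-0700 per-task temp dir and echoes its name back to the controller. A runnable sketch of those two low-level commands (the real run used `/root/.ansible/tmp/ansible-tmp-<timestamp>-<pid>-<random>`; the fixed `ansible-tmp-demo` name and the cleanup line below are illustrative additions, not part of Ansible's actual command):

```shell
# Step 1: discover the remote user's home directory.
/bin/sh -c 'echo ~ && sleep 0'

# Cleanup so this sketch is re-runnable (not part of Ansible's command).
rm -rf "$HOME/.ansible/tmp/ansible-tmp-demo"

# Step 2: create the per-task temp dir (umask 77 => mode 0700) and echo
# its name back so the controller learns where to upload the module.
/bin/sh -c '( umask 77 && mkdir -p "$HOME/.ansible/tmp" && mkdir "$HOME/.ansible/tmp/ansible-tmp-demo" && echo ansible-tmp-demo="$HOME/.ansible/tmp/ansible-tmp-demo" ) && sleep 0'
```

The trace then shows the controller transferring `AnsiballZ_stat.py` (the zipped, self-contained stat module) into that directory before executing it.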
13271 1727203838.23996: running TaskExecutor() for managed-node1/TASK: Stat profile file 13271 1727203838.24095: in run() - task 028d2410-947f-2a40-12ba-000000000441 13271 1727203838.24119: variable 'ansible_search_path' from source: unknown 13271 1727203838.24123: variable 'ansible_search_path' from source: unknown 13271 1727203838.24184: calling self._execute() 13271 1727203838.24249: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.24256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203838.24267: variable 'omit' from source: magic vars 13271 1727203838.24638: variable 'ansible_distribution_major_version' from source: facts 13271 1727203838.24642: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203838.24645: variable 'omit' from source: magic vars 13271 1727203838.24727: variable 'omit' from source: magic vars 13271 1727203838.24795: variable 'profile' from source: include params 13271 1727203838.24798: variable 'item' from source: include params 13271 1727203838.24855: variable 'item' from source: include params 13271 1727203838.24874: variable 'omit' from source: magic vars 13271 1727203838.24917: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203838.25077: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203838.25080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203838.25083: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203838.25085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203838.25088: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 
1727203838.25090: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.25092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203838.25137: Set connection var ansible_connection to ssh 13271 1727203838.25145: Set connection var ansible_shell_type to sh 13271 1727203838.25160: Set connection var ansible_timeout to 10 13271 1727203838.25166: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203838.25168: Set connection var ansible_pipelining to False 13271 1727203838.25171: Set connection var ansible_shell_executable to /bin/sh 13271 1727203838.25196: variable 'ansible_shell_executable' from source: unknown 13271 1727203838.25199: variable 'ansible_connection' from source: unknown 13271 1727203838.25201: variable 'ansible_module_compression' from source: unknown 13271 1727203838.25204: variable 'ansible_shell_type' from source: unknown 13271 1727203838.25206: variable 'ansible_shell_executable' from source: unknown 13271 1727203838.25208: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.25217: variable 'ansible_pipelining' from source: unknown 13271 1727203838.25220: variable 'ansible_timeout' from source: unknown 13271 1727203838.25225: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203838.25428: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13271 1727203838.25443: variable 'omit' from source: magic vars 13271 1727203838.25450: starting attempt loop 13271 1727203838.25453: running the handler 13271 1727203838.25491: _low_level_execute_command(): starting 13271 1727203838.25494: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203838.26212: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203838.26368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203838.26372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203838.26375: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203838.26382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203838.26502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203838.28258: stdout chunk (state=3): >>>/root <<< 13271 1727203838.28363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203838.28421: stderr chunk (state=3): >>><<< 13271 1727203838.28424: stdout chunk (state=3): >>><<< 13271 1727203838.28443: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203838.28463: _low_level_execute_command(): starting 13271 1727203838.28474: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203838.2844918-14720-68292860302561 `" && echo ansible-tmp-1727203838.2844918-14720-68292860302561="` echo /root/.ansible/tmp/ansible-tmp-1727203838.2844918-14720-68292860302561 `" ) && sleep 0' 13271 1727203838.29052: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203838.29056: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203838.29158: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203838.29189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203838.29303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203838.31403: stdout chunk (state=3): >>>ansible-tmp-1727203838.2844918-14720-68292860302561=/root/.ansible/tmp/ansible-tmp-1727203838.2844918-14720-68292860302561 <<< 13271 1727203838.31552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203838.31555: stdout chunk (state=3): >>><<< 13271 1727203838.31557: stderr chunk (state=3): >>><<< 13271 1727203838.31582: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203838.2844918-14720-68292860302561=/root/.ansible/tmp/ansible-tmp-1727203838.2844918-14720-68292860302561 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203838.31749: variable 'ansible_module_compression' from source: unknown 13271 1727203838.31752: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 13271 1727203838.31755: variable 'ansible_facts' from source: unknown 13271 1727203838.31843: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203838.2844918-14720-68292860302561/AnsiballZ_stat.py 13271 1727203838.32091: Sending initial data 13271 1727203838.32101: Sent initial data (152 bytes) 13271 1727203838.32604: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203838.32615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203838.32661: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203838.32715: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203838.32719: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203838.32810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203838.34548: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203838.34630: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203838.34720: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpsa461mzw /root/.ansible/tmp/ansible-tmp-1727203838.2844918-14720-68292860302561/AnsiballZ_stat.py <<< 13271 1727203838.34723: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203838.2844918-14720-68292860302561/AnsiballZ_stat.py" <<< 13271 1727203838.34814: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpsa461mzw" to remote "/root/.ansible/tmp/ansible-tmp-1727203838.2844918-14720-68292860302561/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203838.2844918-14720-68292860302561/AnsiballZ_stat.py" <<< 13271 1727203838.35589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203838.35771: stderr chunk (state=3): >>><<< 13271 1727203838.35774: stdout chunk (state=3): >>><<< 13271 1727203838.35780: done transferring module to remote 13271 1727203838.35783: _low_level_execute_command(): starting 13271 1727203838.35785: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203838.2844918-14720-68292860302561/ /root/.ansible/tmp/ansible-tmp-1727203838.2844918-14720-68292860302561/AnsiballZ_stat.py && sleep 0' 13271 1727203838.36238: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203838.36251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address <<< 13271 1727203838.36266: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203838.36309: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203838.36330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203838.36406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203838.38386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203838.38405: stderr chunk (state=3): >>><<< 13271 1727203838.38419: stdout chunk (state=3): >>><<< 13271 1727203838.38433: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203838.38441: _low_level_execute_command(): starting 13271 1727203838.38443: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203838.2844918-14720-68292860302561/AnsiballZ_stat.py && sleep 0' 13271 1727203838.38849: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203838.38857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203838.38881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203838.38885: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203838.38935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203838.38938: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 
1727203838.39023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203838.55403: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 13271 1727203838.56915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 13271 1727203838.56972: stderr chunk (state=3): >>><<< 13271 1727203838.56978: stdout chunk (state=3): >>><<< 13271 1727203838.57001: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 13271 1727203838.57040: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203838.2844918-14720-68292860302561/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203838.57057: _low_level_execute_command(): starting 13271 1727203838.57071: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203838.2844918-14720-68292860302561/ > /dev/null 2>&1 && sleep 0' 13271 1727203838.57882: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203838.57897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203838.57911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203838.57927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203838.57944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203838.57954: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203838.57970: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203838.57991: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 1727203838.58003: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 13271 1727203838.58013: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13271 1727203838.58089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203838.58109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203838.58124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203838.58147: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203838.58258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203838.60324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203838.60337: stdout chunk (state=3): >>><<< 13271 1727203838.60347: stderr chunk (state=3): >>><<< 13271 1727203838.60397: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203838.60414: handler run complete 13271 1727203838.60468: attempt loop complete, returning result 13271 1727203838.60660: _execute() done 13271 1727203838.60666: dumping result to json 13271 1727203838.60669: done dumping result, returning 13271 1727203838.60671: done running TaskExecutor() for managed-node1/TASK: Stat profile file [028d2410-947f-2a40-12ba-000000000441] 13271 1727203838.60673: sending task result for task 028d2410-947f-2a40-12ba-000000000441 13271 1727203838.60748: done sending task result for task 028d2410-947f-2a40-12ba-000000000441 13271 1727203838.60752: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 13271 1727203838.60827: no more pending results, returning what we have 13271 1727203838.60831: results queue empty 13271 1727203838.60832: checking for any_errors_fatal 13271 1727203838.60840: done checking for any_errors_fatal 13271 1727203838.60841: checking for max_fail_percentage 13271 1727203838.60843: done checking for max_fail_percentage 13271 1727203838.60844: checking to see if all hosts have failed and the running result is not ok 13271 1727203838.60846: done checking to see if all hosts have failed 13271 1727203838.60846: getting the remaining hosts for this loop 13271 
1727203838.60848: done getting the remaining hosts for this loop 13271 1727203838.60852: getting the next task for host managed-node1 13271 1727203838.60861: done getting next task for host managed-node1 13271 1727203838.60867: ^ task is: TASK: Set NM profile exist flag based on the profile files 13271 1727203838.60871: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203838.60875: getting variables 13271 1727203838.60881: in VariableManager get_vars() 13271 1727203838.60928: Calling all_inventory to load vars for managed-node1 13271 1727203838.60931: Calling groups_inventory to load vars for managed-node1 13271 1727203838.60934: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203838.60947: Calling all_plugins_play to load vars for managed-node1 13271 1727203838.60950: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203838.60953: Calling groups_plugins_play to load vars for managed-node1 13271 1727203838.64454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203838.67410: done with get_vars() 13271 1727203838.67443: done getting variables 13271 1727203838.67516: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:50:38 -0400 (0:00:00.442) 0:00:22.318 ***** 13271 1727203838.67548: entering _queue_task() for managed-node1/set_fact 13271 1727203838.68023: worker is 1 (out of 1 available) 13271 1727203838.68035: exiting _queue_task() for managed-node1/set_fact 13271 1727203838.68047: done queuing things up, now waiting for results queue to drain 13271 1727203838.68049: waiting for pending results... 
13271 1727203838.68396: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 13271 1727203838.68424: in run() - task 028d2410-947f-2a40-12ba-000000000442 13271 1727203838.68446: variable 'ansible_search_path' from source: unknown 13271 1727203838.68453: variable 'ansible_search_path' from source: unknown 13271 1727203838.68504: calling self._execute() 13271 1727203838.68631: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.68643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203838.68664: variable 'omit' from source: magic vars 13271 1727203838.69072: variable 'ansible_distribution_major_version' from source: facts 13271 1727203838.69144: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203838.69218: variable 'profile_stat' from source: set_fact 13271 1727203838.69234: Evaluated conditional (profile_stat.stat.exists): False 13271 1727203838.69240: when evaluation is False, skipping this task 13271 1727203838.69249: _execute() done 13271 1727203838.69257: dumping result to json 13271 1727203838.69265: done dumping result, returning 13271 1727203838.69279: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [028d2410-947f-2a40-12ba-000000000442] 13271 1727203838.69288: sending task result for task 028d2410-947f-2a40-12ba-000000000442 13271 1727203838.69491: done sending task result for task 028d2410-947f-2a40-12ba-000000000442 13271 1727203838.69494: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13271 1727203838.69544: no more pending results, returning what we have 13271 1727203838.69547: results queue empty 13271 1727203838.69548: checking for any_errors_fatal 13271 1727203838.69558: done checking for any_errors_fatal 13271 1727203838.69559: 
checking for max_fail_percentage 13271 1727203838.69560: done checking for max_fail_percentage 13271 1727203838.69564: checking to see if all hosts have failed and the running result is not ok 13271 1727203838.69565: done checking to see if all hosts have failed 13271 1727203838.69566: getting the remaining hosts for this loop 13271 1727203838.69568: done getting the remaining hosts for this loop 13271 1727203838.69577: getting the next task for host managed-node1 13271 1727203838.69585: done getting next task for host managed-node1 13271 1727203838.69589: ^ task is: TASK: Get NM profile info 13271 1727203838.69593: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203838.69597: getting variables 13271 1727203838.69599: in VariableManager get_vars() 13271 1727203838.69644: Calling all_inventory to load vars for managed-node1 13271 1727203838.69647: Calling groups_inventory to load vars for managed-node1 13271 1727203838.69649: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203838.69667: Calling all_plugins_play to load vars for managed-node1 13271 1727203838.69671: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203838.69782: Calling groups_plugins_play to load vars for managed-node1 13271 1727203838.71272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203838.73647: done with get_vars() 13271 1727203838.73679: done getting variables 13271 1727203838.73732: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:50:38 -0400 (0:00:00.062) 0:00:22.380 ***** 13271 1727203838.73768: entering _queue_task() for managed-node1/shell 13271 1727203838.74140: worker is 1 (out of 1 available) 13271 1727203838.74154: exiting _queue_task() for managed-node1/shell 13271 1727203838.74170: done queuing things up, now waiting for results queue to drain 13271 1727203838.74180: waiting for pending results... 
13271 1727203838.74410: running TaskExecutor() for managed-node1/TASK: Get NM profile info 13271 1727203838.74684: in run() - task 028d2410-947f-2a40-12ba-000000000443 13271 1727203838.74698: variable 'ansible_search_path' from source: unknown 13271 1727203838.74703: variable 'ansible_search_path' from source: unknown 13271 1727203838.74840: calling self._execute() 13271 1727203838.75048: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.75053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203838.75056: variable 'omit' from source: magic vars 13271 1727203838.75352: variable 'ansible_distribution_major_version' from source: facts 13271 1727203838.75366: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203838.75374: variable 'omit' from source: magic vars 13271 1727203838.75429: variable 'omit' from source: magic vars 13271 1727203838.75523: variable 'profile' from source: include params 13271 1727203838.75526: variable 'item' from source: include params 13271 1727203838.75594: variable 'item' from source: include params 13271 1727203838.75612: variable 'omit' from source: magic vars 13271 1727203838.75704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203838.75708: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203838.75715: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203838.75733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203838.75745: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203838.75774: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 
1727203838.75780: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.75782: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203838.75922: Set connection var ansible_connection to ssh 13271 1727203838.75925: Set connection var ansible_shell_type to sh 13271 1727203838.75928: Set connection var ansible_timeout to 10 13271 1727203838.75930: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203838.75932: Set connection var ansible_pipelining to False 13271 1727203838.75935: Set connection var ansible_shell_executable to /bin/sh 13271 1727203838.75944: variable 'ansible_shell_executable' from source: unknown 13271 1727203838.75947: variable 'ansible_connection' from source: unknown 13271 1727203838.75949: variable 'ansible_module_compression' from source: unknown 13271 1727203838.75951: variable 'ansible_shell_type' from source: unknown 13271 1727203838.75953: variable 'ansible_shell_executable' from source: unknown 13271 1727203838.75955: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203838.75957: variable 'ansible_pipelining' from source: unknown 13271 1727203838.75960: variable 'ansible_timeout' from source: unknown 13271 1727203838.75965: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203838.76167: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203838.76171: variable 'omit' from source: magic vars 13271 1727203838.76173: starting attempt loop 13271 1727203838.76178: running the handler 13271 1727203838.76181: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203838.76184: _low_level_execute_command(): starting 13271 1727203838.76186: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203838.76989: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203838.77095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203838.77098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203838.77101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203838.77103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203838.77105: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203838.77108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203838.77150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203838.77167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203838.77185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203838.77307: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203838.79096: stdout chunk (state=3): >>>/root <<< 13271 1727203838.79181: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203838.79333: stderr chunk (state=3): >>><<< 13271 1727203838.79336: stdout chunk (state=3): >>><<< 13271 1727203838.79340: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203838.79342: _low_level_execute_command(): starting 13271 1727203838.79345: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203838.7924335-14744-103683802714564 `" && echo ansible-tmp-1727203838.7924335-14744-103683802714564="` echo 
/root/.ansible/tmp/ansible-tmp-1727203838.7924335-14744-103683802714564 `" ) && sleep 0' 13271 1727203838.79937: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203838.79982: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203838.80107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203838.82223: stdout chunk (state=3): >>>ansible-tmp-1727203838.7924335-14744-103683802714564=/root/.ansible/tmp/ansible-tmp-1727203838.7924335-14744-103683802714564 <<< 13271 1727203838.82368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203838.82384: stderr chunk (state=3): >>><<< 13271 1727203838.82388: stdout chunk (state=3): >>><<< 13271 1727203838.82449: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203838.7924335-14744-103683802714564=/root/.ansible/tmp/ansible-tmp-1727203838.7924335-14744-103683802714564 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 
4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203838.82452: variable 'ansible_module_compression' from source: unknown 13271 1727203838.82515: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13271 1727203838.82542: variable 'ansible_facts' from source: unknown 13271 1727203838.82897: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203838.7924335-14744-103683802714564/AnsiballZ_command.py 13271 1727203838.83241: Sending initial data 13271 1727203838.83244: Sent initial data (156 bytes) 13271 1727203838.84158: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203838.84237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203838.84243: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203838.84313: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203838.84501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203838.86472: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203838.86544: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203838.86670: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpkw7exzi1 /root/.ansible/tmp/ansible-tmp-1727203838.7924335-14744-103683802714564/AnsiballZ_command.py <<< 13271 1727203838.86674: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203838.7924335-14744-103683802714564/AnsiballZ_command.py" <<< 13271 1727203838.86731: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpkw7exzi1" to remote "/root/.ansible/tmp/ansible-tmp-1727203838.7924335-14744-103683802714564/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203838.7924335-14744-103683802714564/AnsiballZ_command.py" <<< 13271 1727203838.88225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203838.88243: stderr chunk (state=3): >>><<< 13271 1727203838.88246: stdout chunk (state=3): >>><<< 13271 1727203838.88481: done transferring module to remote 13271 1727203838.88485: _low_level_execute_command(): starting 13271 1727203838.88488: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203838.7924335-14744-103683802714564/ /root/.ansible/tmp/ansible-tmp-1727203838.7924335-14744-103683802714564/AnsiballZ_command.py && sleep 0' 13271 1727203838.89562: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203838.89580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203838.89599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203838.89779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203838.89833: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203838.89851: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203838.89889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203838.90040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203838.92054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203838.92065: stdout chunk (state=3): >>><<< 13271 1727203838.92083: stderr chunk (state=3): >>><<< 13271 1727203838.92105: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203838.92114: _low_level_execute_command(): starting 13271 1727203838.92124: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203838.7924335-14744-103683802714564/AnsiballZ_command.py && sleep 0' 13271 1727203838.92718: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203838.92736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203838.92750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203838.92770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203838.92792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203838.92805: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203838.92819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203838.92839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 1727203838.92852: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 13271 1727203838.92864: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13271 1727203838.92880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203838.92967: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203838.92983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203838.93107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203839.11713: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:50:39.092340", "end": "2024-09-24 14:50:39.114106", "delta": "0:00:00.021766", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13271 1727203839.13365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203839.13369: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203839.13425: stderr chunk (state=3): >>><<< 13271 1727203839.13440: stdout chunk (state=3): >>><<< 13271 1727203839.13464: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:50:39.092340", "end": "2024-09-24 14:50:39.114106", "delta": "0:00:00.021766", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.14.47 closed. 13271 1727203839.13502: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203838.7924335-14744-103683802714564/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203839.13510: _low_level_execute_command(): starting 13271 1727203839.13515: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203838.7924335-14744-103683802714564/ > /dev/null 2>&1 && sleep 0' 13271 1727203839.14122: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203839.14136: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203839.14380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203839.14384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203839.14386: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203839.14391: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203839.14393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203839.14395: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 13271 1727203839.14397: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 13271 1727203839.14399: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13271 1727203839.14401: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203839.14403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203839.14405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203839.14407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203839.14409: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203839.14411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203839.14413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203839.14485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203839.16478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203839.16482: stdout chunk (state=3): >>><<< 13271 1727203839.16491: stderr chunk (state=3): >>><<< 13271 1727203839.16503: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203839.16509: handler run complete 13271 1727203839.16528: Evaluated conditional (False): False 13271 1727203839.16537: attempt loop complete, returning result 13271 1727203839.16540: _execute() done 13271 1727203839.16543: dumping result to json 13271 1727203839.16549: done dumping result, returning 13271 1727203839.16555: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [028d2410-947f-2a40-12ba-000000000443] 13271 1727203839.16559: sending task result for task 028d2410-947f-2a40-12ba-000000000443 13271 1727203839.16649: done sending task result for task 028d2410-947f-2a40-12ba-000000000443 13271 1727203839.16653: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.021766", "end": "2024-09-24 14:50:39.114106", "rc": 0, "start": "2024-09-24 14:50:39.092340" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 13271 1727203839.16726: no more pending results, returning what we have 13271 1727203839.16729: results queue empty 13271 1727203839.16730: checking for any_errors_fatal 13271 1727203839.16735: done checking for any_errors_fatal 13271 1727203839.16736: checking for max_fail_percentage 
13271 1727203839.16738: done checking for max_fail_percentage 13271 1727203839.16739: checking to see if all hosts have failed and the running result is not ok 13271 1727203839.16740: done checking to see if all hosts have failed 13271 1727203839.16740: getting the remaining hosts for this loop 13271 1727203839.16742: done getting the remaining hosts for this loop 13271 1727203839.16745: getting the next task for host managed-node1 13271 1727203839.16752: done getting next task for host managed-node1 13271 1727203839.16754: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13271 1727203839.16758: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203839.16769: getting variables 13271 1727203839.16771: in VariableManager get_vars() 13271 1727203839.16830: Calling all_inventory to load vars for managed-node1 13271 1727203839.16832: Calling groups_inventory to load vars for managed-node1 13271 1727203839.16834: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203839.16845: Calling all_plugins_play to load vars for managed-node1 13271 1727203839.16847: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203839.16850: Calling groups_plugins_play to load vars for managed-node1 13271 1727203839.18238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203839.19565: done with get_vars() 13271 1727203839.19587: done getting variables 13271 1727203839.19632: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:50:39 -0400 (0:00:00.458) 0:00:22.839 ***** 13271 1727203839.19655: entering _queue_task() for managed-node1/set_fact 13271 1727203839.19904: worker is 1 (out of 1 available) 13271 1727203839.19917: exiting _queue_task() for managed-node1/set_fact 13271 1727203839.19928: done queuing things up, now waiting for results queue to drain 13271 1727203839.19931: waiting for pending results... 
13271 1727203839.20105: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 13271 1727203839.20278: in run() - task 028d2410-947f-2a40-12ba-000000000444 13271 1727203839.20283: variable 'ansible_search_path' from source: unknown 13271 1727203839.20285: variable 'ansible_search_path' from source: unknown 13271 1727203839.20288: calling self._execute() 13271 1727203839.20338: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.20342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.20356: variable 'omit' from source: magic vars 13271 1727203839.20885: variable 'ansible_distribution_major_version' from source: facts 13271 1727203839.20889: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203839.20891: variable 'nm_profile_exists' from source: set_fact 13271 1727203839.20894: Evaluated conditional (nm_profile_exists.rc == 0): True 13271 1727203839.20896: variable 'omit' from source: magic vars 13271 1727203839.20921: variable 'omit' from source: magic vars 13271 1727203839.20952: variable 'omit' from source: magic vars 13271 1727203839.20999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203839.21027: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203839.21081: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203839.21095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203839.21107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203839.21136: variable 'inventory_hostname' from source: host vars for 'managed-node1' 
13271 1727203839.21139: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.21142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.21239: Set connection var ansible_connection to ssh 13271 1727203839.21246: Set connection var ansible_shell_type to sh 13271 1727203839.21254: Set connection var ansible_timeout to 10 13271 1727203839.21260: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203839.21268: Set connection var ansible_pipelining to False 13271 1727203839.21274: Set connection var ansible_shell_executable to /bin/sh 13271 1727203839.21623: variable 'ansible_shell_executable' from source: unknown 13271 1727203839.21626: variable 'ansible_connection' from source: unknown 13271 1727203839.21628: variable 'ansible_module_compression' from source: unknown 13271 1727203839.21631: variable 'ansible_shell_type' from source: unknown 13271 1727203839.21633: variable 'ansible_shell_executable' from source: unknown 13271 1727203839.21635: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.21637: variable 'ansible_pipelining' from source: unknown 13271 1727203839.21640: variable 'ansible_timeout' from source: unknown 13271 1727203839.21650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.21929: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203839.21941: variable 'omit' from source: magic vars 13271 1727203839.21947: starting attempt loop 13271 1727203839.21950: running the handler 13271 1727203839.21961: handler run complete 13271 1727203839.22083: attempt loop complete, returning result 13271 1727203839.22087: _execute() done 
13271 1727203839.22089: dumping result to json 13271 1727203839.22091: done dumping result, returning 13271 1727203839.22094: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [028d2410-947f-2a40-12ba-000000000444] 13271 1727203839.22096: sending task result for task 028d2410-947f-2a40-12ba-000000000444 13271 1727203839.22160: done sending task result for task 028d2410-947f-2a40-12ba-000000000444 13271 1727203839.22163: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 13271 1727203839.22234: no more pending results, returning what we have 13271 1727203839.22237: results queue empty 13271 1727203839.22238: checking for any_errors_fatal 13271 1727203839.22246: done checking for any_errors_fatal 13271 1727203839.22247: checking for max_fail_percentage 13271 1727203839.22248: done checking for max_fail_percentage 13271 1727203839.22249: checking to see if all hosts have failed and the running result is not ok 13271 1727203839.22250: done checking to see if all hosts have failed 13271 1727203839.22251: getting the remaining hosts for this loop 13271 1727203839.22252: done getting the remaining hosts for this loop 13271 1727203839.22257: getting the next task for host managed-node1 13271 1727203839.22266: done getting next task for host managed-node1 13271 1727203839.22268: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 13271 1727203839.22378: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203839.22386: getting variables 13271 1727203839.22387: in VariableManager get_vars() 13271 1727203839.22423: Calling all_inventory to load vars for managed-node1 13271 1727203839.22425: Calling groups_inventory to load vars for managed-node1 13271 1727203839.22427: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203839.22436: Calling all_plugins_play to load vars for managed-node1 13271 1727203839.22439: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203839.22441: Calling groups_plugins_play to load vars for managed-node1 13271 1727203839.24127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203839.25941: done with get_vars() 13271 1727203839.25970: done getting variables 13271 1727203839.26037: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203839.26178: variable 'profile' from source: include params 13271 1727203839.26182: variable 'item' from source: include params 13271 1727203839.26245: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:50:39 -0400 (0:00:00.066) 0:00:22.906 ***** 13271 1727203839.26287: entering _queue_task() for managed-node1/command 13271 1727203839.26659: worker is 1 (out of 1 available) 13271 1727203839.26788: exiting _queue_task() for managed-node1/command 13271 1727203839.26804: done queuing things up, now waiting for results queue to drain 13271 1727203839.26806: waiting for pending results... 13271 1727203839.27096: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 13271 1727203839.27141: in run() - task 028d2410-947f-2a40-12ba-000000000446 13271 1727203839.27160: variable 'ansible_search_path' from source: unknown 13271 1727203839.27193: variable 'ansible_search_path' from source: unknown 13271 1727203839.27298: calling self._execute() 13271 1727203839.27335: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.27348: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.27369: variable 'omit' from source: magic vars 13271 1727203839.27759: variable 'ansible_distribution_major_version' from source: facts 13271 1727203839.27782: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203839.27911: variable 'profile_stat' from source: set_fact 13271 1727203839.27928: Evaluated conditional (profile_stat.stat.exists): False 13271 1727203839.27936: when evaluation is False, skipping this task 13271 1727203839.27958: _execute() done 13271 1727203839.27960: dumping result to json 13271 1727203839.27965: done dumping result, returning 13271 1727203839.27968: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [028d2410-947f-2a40-12ba-000000000446] 13271 1727203839.28083: sending task result for task 028d2410-947f-2a40-12ba-000000000446 13271 
1727203839.28145: done sending task result for task 028d2410-947f-2a40-12ba-000000000446 13271 1727203839.28147: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13271 1727203839.28212: no more pending results, returning what we have 13271 1727203839.28216: results queue empty 13271 1727203839.28217: checking for any_errors_fatal 13271 1727203839.28223: done checking for any_errors_fatal 13271 1727203839.28224: checking for max_fail_percentage 13271 1727203839.28226: done checking for max_fail_percentage 13271 1727203839.28227: checking to see if all hosts have failed and the running result is not ok 13271 1727203839.28228: done checking to see if all hosts have failed 13271 1727203839.28229: getting the remaining hosts for this loop 13271 1727203839.28231: done getting the remaining hosts for this loop 13271 1727203839.28234: getting the next task for host managed-node1 13271 1727203839.28241: done getting next task for host managed-node1 13271 1727203839.28244: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 13271 1727203839.28248: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 13271 1727203839.28252: getting variables 13271 1727203839.28255: in VariableManager get_vars() 13271 1727203839.28390: Calling all_inventory to load vars for managed-node1 13271 1727203839.28393: Calling groups_inventory to load vars for managed-node1 13271 1727203839.28395: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203839.28521: Calling all_plugins_play to load vars for managed-node1 13271 1727203839.28524: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203839.28530: Calling groups_plugins_play to load vars for managed-node1 13271 1727203839.29899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203839.31614: done with get_vars() 13271 1727203839.31638: done getting variables 13271 1727203839.31709: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203839.31837: variable 'profile' from source: include params 13271 1727203839.31842: variable 'item' from source: include params 13271 1727203839.31906: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:50:39 -0400 (0:00:00.056) 0:00:22.962 ***** 13271 1727203839.31951: entering _queue_task() for managed-node1/set_fact 13271 1727203839.32321: worker is 1 (out of 1 available) 13271 1727203839.32333: exiting _queue_task() for managed-node1/set_fact 13271 1727203839.32345: done queuing things up, now waiting for results queue 
to drain 13271 1727203839.32347: waiting for pending results... 13271 1727203839.32707: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 13271 1727203839.32809: in run() - task 028d2410-947f-2a40-12ba-000000000447 13271 1727203839.32813: variable 'ansible_search_path' from source: unknown 13271 1727203839.32816: variable 'ansible_search_path' from source: unknown 13271 1727203839.32819: calling self._execute() 13271 1727203839.32933: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.32938: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.32951: variable 'omit' from source: magic vars 13271 1727203839.33567: variable 'ansible_distribution_major_version' from source: facts 13271 1727203839.33582: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203839.33781: variable 'profile_stat' from source: set_fact 13271 1727203839.33788: Evaluated conditional (profile_stat.stat.exists): False 13271 1727203839.33790: when evaluation is False, skipping this task 13271 1727203839.33792: _execute() done 13271 1727203839.33795: dumping result to json 13271 1727203839.33798: done dumping result, returning 13271 1727203839.33804: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [028d2410-947f-2a40-12ba-000000000447] 13271 1727203839.33806: sending task result for task 028d2410-947f-2a40-12ba-000000000447 13271 1727203839.33861: done sending task result for task 028d2410-947f-2a40-12ba-000000000447 13271 1727203839.33864: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13271 1727203839.33959: no more pending results, returning what we have 13271 1727203839.33965: results queue empty 13271 1727203839.33967: checking for any_errors_fatal 13271 
1727203839.33972: done checking for any_errors_fatal 13271 1727203839.33972: checking for max_fail_percentage 13271 1727203839.33974: done checking for max_fail_percentage 13271 1727203839.33977: checking to see if all hosts have failed and the running result is not ok 13271 1727203839.33978: done checking to see if all hosts have failed 13271 1727203839.33978: getting the remaining hosts for this loop 13271 1727203839.33980: done getting the remaining hosts for this loop 13271 1727203839.33984: getting the next task for host managed-node1 13271 1727203839.33992: done getting next task for host managed-node1 13271 1727203839.33994: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 13271 1727203839.33999: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203839.34003: getting variables 13271 1727203839.34005: in VariableManager get_vars() 13271 1727203839.34046: Calling all_inventory to load vars for managed-node1 13271 1727203839.34049: Calling groups_inventory to load vars for managed-node1 13271 1727203839.34051: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203839.34068: Calling all_plugins_play to load vars for managed-node1 13271 1727203839.34071: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203839.34074: Calling groups_plugins_play to load vars for managed-node1 13271 1727203839.40065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203839.41626: done with get_vars() 13271 1727203839.41652: done getting variables 13271 1727203839.41708: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203839.41800: variable 'profile' from source: include params 13271 1727203839.41803: variable 'item' from source: include params 13271 1727203839.41865: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:50:39 -0400 (0:00:00.099) 0:00:23.062 ***** 13271 1727203839.41893: entering _queue_task() for managed-node1/command 13271 1727203839.42229: worker is 1 (out of 1 available) 13271 1727203839.42241: exiting _queue_task() for managed-node1/command 13271 1727203839.42255: done queuing things up, now waiting for results queue to drain 13271 1727203839.42257: waiting for pending results... 
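
Each of the ifcfg inspection tasks in this stretch of the trace is skipped with `"false_condition": "profile_stat.stat.exists"`. Since get_profile_stat.yml is not reproduced in this log, the sketch below only illustrates the general pattern being traced (a `command` task gated on an earlier `stat` result); the grep arguments and register name are assumptions, not the real file contents:

```yaml
# Illustrative only; mirrors the skip pattern in the trace, not the real file.
- name: Get the fingerprint comment in ifcfg-{{ profile }}
  command: grep "^# Ansible managed" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: fingerprint_comment       # assumed register name
  when: profile_stat.stat.exists      # the conditional evaluated False in the log
```

When `when:` evaluates False, the action handler never runs and the task result carries `skip_reason: "Conditional result was False"`, which is exactly what the skipping: records above show.
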
13271 1727203839.42606: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 13271 1727203839.42662: in run() - task 028d2410-947f-2a40-12ba-000000000448 13271 1727203839.42684: variable 'ansible_search_path' from source: unknown 13271 1727203839.42691: variable 'ansible_search_path' from source: unknown 13271 1727203839.42740: calling self._execute() 13271 1727203839.42880: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.42883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.42886: variable 'omit' from source: magic vars 13271 1727203839.43254: variable 'ansible_distribution_major_version' from source: facts 13271 1727203839.43271: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203839.43399: variable 'profile_stat' from source: set_fact 13271 1727203839.43415: Evaluated conditional (profile_stat.stat.exists): False 13271 1727203839.43463: when evaluation is False, skipping this task 13271 1727203839.43467: _execute() done 13271 1727203839.43469: dumping result to json 13271 1727203839.43472: done dumping result, returning 13271 1727203839.43474: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 [028d2410-947f-2a40-12ba-000000000448] 13271 1727203839.43478: sending task result for task 028d2410-947f-2a40-12ba-000000000448 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13271 1727203839.43730: no more pending results, returning what we have 13271 1727203839.43733: results queue empty 13271 1727203839.43734: checking for any_errors_fatal 13271 1727203839.43740: done checking for any_errors_fatal 13271 1727203839.43741: checking for max_fail_percentage 13271 1727203839.43743: done checking for max_fail_percentage 13271 1727203839.43744: checking to see if all hosts have 
failed and the running result is not ok 13271 1727203839.43745: done checking to see if all hosts have failed 13271 1727203839.43746: getting the remaining hosts for this loop 13271 1727203839.43747: done getting the remaining hosts for this loop 13271 1727203839.43751: getting the next task for host managed-node1 13271 1727203839.43759: done getting next task for host managed-node1 13271 1727203839.43762: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 13271 1727203839.43765: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203839.43770: getting variables 13271 1727203839.43772: in VariableManager get_vars() 13271 1727203839.43815: Calling all_inventory to load vars for managed-node1 13271 1727203839.43818: Calling groups_inventory to load vars for managed-node1 13271 1727203839.43821: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203839.43835: Calling all_plugins_play to load vars for managed-node1 13271 1727203839.43838: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203839.43841: Calling groups_plugins_play to load vars for managed-node1 13271 1727203839.44391: done sending task result for task 028d2410-947f-2a40-12ba-000000000448 13271 1727203839.44394: WORKER PROCESS EXITING 13271 1727203839.45340: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203839.46950: done with get_vars() 13271 1727203839.46979: done getting variables 13271 1727203839.47037: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203839.47149: variable 'profile' from source: include params 13271 1727203839.47153: variable 'item' from source: include params 13271 1727203839.47215: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:50:39 -0400 (0:00:00.053) 0:00:23.115 ***** 13271 1727203839.47245: entering _queue_task() for managed-node1/set_fact 13271 1727203839.47556: worker is 1 (out of 1 available) 13271 1727203839.47567: exiting _queue_task() for managed-node1/set_fact 13271 
1727203839.47580: done queuing things up, now waiting for results queue to drain 13271 1727203839.47581: waiting for pending results... 13271 1727203839.47863: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 13271 1727203839.47984: in run() - task 028d2410-947f-2a40-12ba-000000000449 13271 1727203839.48003: variable 'ansible_search_path' from source: unknown 13271 1727203839.48010: variable 'ansible_search_path' from source: unknown 13271 1727203839.48055: calling self._execute() 13271 1727203839.48162: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.48176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.48196: variable 'omit' from source: magic vars 13271 1727203839.48573: variable 'ansible_distribution_major_version' from source: facts 13271 1727203839.48599: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203839.48726: variable 'profile_stat' from source: set_fact 13271 1727203839.48746: Evaluated conditional (profile_stat.stat.exists): False 13271 1727203839.48755: when evaluation is False, skipping this task 13271 1727203839.48763: _execute() done 13271 1727203839.48771: dumping result to json 13271 1727203839.48781: done dumping result, returning 13271 1727203839.48792: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [028d2410-947f-2a40-12ba-000000000449] 13271 1727203839.48807: sending task result for task 028d2410-947f-2a40-12ba-000000000449 13271 1727203839.48982: done sending task result for task 028d2410-947f-2a40-12ba-000000000449 13271 1727203839.48986: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 13271 1727203839.49034: no more pending results, returning what we have 13271 1727203839.49038: results queue empty 13271 
1727203839.49039: checking for any_errors_fatal 13271 1727203839.49044: done checking for any_errors_fatal 13271 1727203839.49044: checking for max_fail_percentage 13271 1727203839.49046: done checking for max_fail_percentage 13271 1727203839.49047: checking to see if all hosts have failed and the running result is not ok 13271 1727203839.49048: done checking to see if all hosts have failed 13271 1727203839.49048: getting the remaining hosts for this loop 13271 1727203839.49050: done getting the remaining hosts for this loop 13271 1727203839.49053: getting the next task for host managed-node1 13271 1727203839.49062: done getting next task for host managed-node1 13271 1727203839.49065: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 13271 1727203839.49068: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203839.49073: getting variables 13271 1727203839.49075: in VariableManager get_vars() 13271 1727203839.49226: Calling all_inventory to load vars for managed-node1 13271 1727203839.49229: Calling groups_inventory to load vars for managed-node1 13271 1727203839.49232: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203839.49245: Calling all_plugins_play to load vars for managed-node1 13271 1727203839.49248: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203839.49251: Calling groups_plugins_play to load vars for managed-node1 13271 1727203839.50916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203839.52821: done with get_vars() 13271 1727203839.52844: done getting variables 13271 1727203839.52909: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203839.53051: variable 'profile' from source: include params 13271 1727203839.53056: variable 'item' from source: include params 13271 1727203839.53117: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:50:39 -0400 (0:00:00.058) 0:00:23.174 ***** 13271 1727203839.53147: entering _queue_task() for managed-node1/assert 13271 1727203839.53705: worker is 1 (out of 1 available) 13271 1727203839.53714: exiting _queue_task() for managed-node1/assert 13271 1727203839.53724: done queuing things up, now waiting for results queue to drain 13271 1727203839.53726: waiting for pending results... 
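
The assert task queued here passes because `lsr_net_profile_exists` was set true by the earlier set_fact. A hedged sketch of such an assertion (the condition name comes from the trace; the failure message is an assumption):

```yaml
# Sketch of the condition asserted in the trace; fail_msg wording is assumed.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists
    fail_msg: "Profile {{ profile }} is not present"
```
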
13271 1727203839.53868: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.1' 13271 1727203839.54181: in run() - task 028d2410-947f-2a40-12ba-00000000026e 13271 1727203839.54185: variable 'ansible_search_path' from source: unknown 13271 1727203839.54188: variable 'ansible_search_path' from source: unknown 13271 1727203839.54191: calling self._execute() 13271 1727203839.54193: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.54195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.54199: variable 'omit' from source: magic vars 13271 1727203839.54543: variable 'ansible_distribution_major_version' from source: facts 13271 1727203839.54554: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203839.54560: variable 'omit' from source: magic vars 13271 1727203839.54611: variable 'omit' from source: magic vars 13271 1727203839.54721: variable 'profile' from source: include params 13271 1727203839.54724: variable 'item' from source: include params 13271 1727203839.54792: variable 'item' from source: include params 13271 1727203839.54812: variable 'omit' from source: magic vars 13271 1727203839.54860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203839.54899: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203839.54921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203839.54946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203839.54959: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203839.54993: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 13271 1727203839.54997: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.54999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.55101: Set connection var ansible_connection to ssh 13271 1727203839.55108: Set connection var ansible_shell_type to sh 13271 1727203839.55116: Set connection var ansible_timeout to 10 13271 1727203839.55121: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203839.55127: Set connection var ansible_pipelining to False 13271 1727203839.55132: Set connection var ansible_shell_executable to /bin/sh 13271 1727203839.55165: variable 'ansible_shell_executable' from source: unknown 13271 1727203839.55168: variable 'ansible_connection' from source: unknown 13271 1727203839.55171: variable 'ansible_module_compression' from source: unknown 13271 1727203839.55178: variable 'ansible_shell_type' from source: unknown 13271 1727203839.55183: variable 'ansible_shell_executable' from source: unknown 13271 1727203839.55186: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.55188: variable 'ansible_pipelining' from source: unknown 13271 1727203839.55191: variable 'ansible_timeout' from source: unknown 13271 1727203839.55193: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.55498: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203839.55509: variable 'omit' from source: magic vars 13271 1727203839.55514: starting attempt loop 13271 1727203839.55517: running the handler 13271 1727203839.55984: variable 'lsr_net_profile_exists' from source: set_fact 13271 1727203839.55986: Evaluated conditional 
(lsr_net_profile_exists): True 13271 1727203839.55988: handler run complete 13271 1727203839.55990: attempt loop complete, returning result 13271 1727203839.55991: _execute() done 13271 1727203839.55993: dumping result to json 13271 1727203839.55995: done dumping result, returning 13271 1727203839.55997: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.1' [028d2410-947f-2a40-12ba-00000000026e] 13271 1727203839.55998: sending task result for task 028d2410-947f-2a40-12ba-00000000026e ok: [managed-node1] => { "changed": false } MSG: All assertions passed 13271 1727203839.56137: no more pending results, returning what we have 13271 1727203839.56141: results queue empty 13271 1727203839.56142: checking for any_errors_fatal 13271 1727203839.56150: done checking for any_errors_fatal 13271 1727203839.56150: checking for max_fail_percentage 13271 1727203839.56152: done checking for max_fail_percentage 13271 1727203839.56153: checking to see if all hosts have failed and the running result is not ok 13271 1727203839.56154: done checking to see if all hosts have failed 13271 1727203839.56155: getting the remaining hosts for this loop 13271 1727203839.56156: done getting the remaining hosts for this loop 13271 1727203839.56160: getting the next task for host managed-node1 13271 1727203839.56167: done getting next task for host managed-node1 13271 1727203839.56170: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 13271 1727203839.56173: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203839.56180: getting variables 13271 1727203839.56182: in VariableManager get_vars() 13271 1727203839.56229: Calling all_inventory to load vars for managed-node1 13271 1727203839.56232: Calling groups_inventory to load vars for managed-node1 13271 1727203839.56235: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203839.56250: Calling all_plugins_play to load vars for managed-node1 13271 1727203839.56253: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203839.56259: Calling groups_plugins_play to load vars for managed-node1 13271 1727203839.56781: done sending task result for task 028d2410-947f-2a40-12ba-00000000026e 13271 1727203839.56785: WORKER PROCESS EXITING 13271 1727203839.59506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203839.62696: done with get_vars() 13271 1727203839.62722: done getting variables 13271 1727203839.62985: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203839.63095: variable 'profile' from source: include params 13271 1727203839.63099: variable 'item' from source: include params 13271 1727203839.63148: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:50:39 -0400 (0:00:00.100) 0:00:23.274 ***** 13271 1727203839.63386: entering _queue_task() for managed-node1/assert 13271 
1727203839.64038: worker is 1 (out of 1 available) 13271 1727203839.64052: exiting _queue_task() for managed-node1/assert 13271 1727203839.64064: done queuing things up, now waiting for results queue to drain 13271 1727203839.64066: waiting for pending results... 13271 1727203839.64373: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' 13271 1727203839.64583: in run() - task 028d2410-947f-2a40-12ba-00000000026f 13271 1727203839.64587: variable 'ansible_search_path' from source: unknown 13271 1727203839.64591: variable 'ansible_search_path' from source: unknown 13271 1727203839.64594: calling self._execute() 13271 1727203839.64666: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.64670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.64683: variable 'omit' from source: magic vars 13271 1727203839.65279: variable 'ansible_distribution_major_version' from source: facts 13271 1727203839.65284: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203839.65287: variable 'omit' from source: magic vars 13271 1727203839.65290: variable 'omit' from source: magic vars 13271 1727203839.65292: variable 'profile' from source: include params 13271 1727203839.65294: variable 'item' from source: include params 13271 1727203839.65312: variable 'item' from source: include params 13271 1727203839.65331: variable 'omit' from source: magic vars 13271 1727203839.65379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203839.65419: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203839.65440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203839.65458: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 13271 1727203839.65471: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203839.65510: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203839.65513: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.65515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.65781: Set connection var ansible_connection to ssh 13271 1727203839.65784: Set connection var ansible_shell_type to sh 13271 1727203839.65786: Set connection var ansible_timeout to 10 13271 1727203839.65789: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203839.65791: Set connection var ansible_pipelining to False 13271 1727203839.65793: Set connection var ansible_shell_executable to /bin/sh 13271 1727203839.65795: variable 'ansible_shell_executable' from source: unknown 13271 1727203839.65797: variable 'ansible_connection' from source: unknown 13271 1727203839.65799: variable 'ansible_module_compression' from source: unknown 13271 1727203839.65801: variable 'ansible_shell_type' from source: unknown 13271 1727203839.65803: variable 'ansible_shell_executable' from source: unknown 13271 1727203839.65805: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.65808: variable 'ansible_pipelining' from source: unknown 13271 1727203839.65810: variable 'ansible_timeout' from source: unknown 13271 1727203839.65816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.65855: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 
1727203839.65868: variable 'omit' from source: magic vars 13271 1727203839.65874: starting attempt loop 13271 1727203839.65879: running the handler 13271 1727203839.65990: variable 'lsr_net_profile_ansible_managed' from source: set_fact 13271 1727203839.65994: Evaluated conditional (lsr_net_profile_ansible_managed): True 13271 1727203839.66000: handler run complete 13271 1727203839.66293: attempt loop complete, returning result 13271 1727203839.66296: _execute() done 13271 1727203839.66298: dumping result to json 13271 1727203839.66301: done dumping result, returning 13271 1727203839.66303: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' [028d2410-947f-2a40-12ba-00000000026f] 13271 1727203839.66305: sending task result for task 028d2410-947f-2a40-12ba-00000000026f 13271 1727203839.66359: done sending task result for task 028d2410-947f-2a40-12ba-00000000026f 13271 1727203839.66366: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 13271 1727203839.66409: no more pending results, returning what we have 13271 1727203839.66412: results queue empty 13271 1727203839.66413: checking for any_errors_fatal 13271 1727203839.66418: done checking for any_errors_fatal 13271 1727203839.66419: checking for max_fail_percentage 13271 1727203839.66421: done checking for max_fail_percentage 13271 1727203839.66422: checking to see if all hosts have failed and the running result is not ok 13271 1727203839.66423: done checking to see if all hosts have failed 13271 1727203839.66423: getting the remaining hosts for this loop 13271 1727203839.66425: done getting the remaining hosts for this loop 13271 1727203839.66428: getting the next task for host managed-node1 13271 1727203839.66433: done getting next task for host managed-node1 13271 1727203839.66436: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 13271 1727203839.66438: ^ state is: HOST 
STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203839.66442: getting variables 13271 1727203839.66444: in VariableManager get_vars() 13271 1727203839.66531: Calling all_inventory to load vars for managed-node1 13271 1727203839.66534: Calling groups_inventory to load vars for managed-node1 13271 1727203839.66537: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203839.66546: Calling all_plugins_play to load vars for managed-node1 13271 1727203839.66549: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203839.66552: Calling groups_plugins_play to load vars for managed-node1 13271 1727203839.68284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203839.70026: done with get_vars() 13271 1727203839.70050: done getting variables 13271 1727203839.70115: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203839.70231: variable 'profile' from source: include params 13271 1727203839.70235: variable 'item' from source: include params 13271 1727203839.70300: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:50:39 -0400 (0:00:00.071) 0:00:23.346 ***** 13271 1727203839.70335: entering _queue_task() for managed-node1/assert 13271 1727203839.70767: worker is 1 (out of 1 available) 13271 1727203839.70783: exiting _queue_task() for managed-node1/assert 13271 1727203839.70810: done queuing things up, now waiting for results queue to drain 13271 1727203839.70812: waiting for pending results... 13271 1727203839.71086: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.1 13271 1727203839.71209: in run() - task 028d2410-947f-2a40-12ba-000000000270 13271 1727203839.71232: variable 'ansible_search_path' from source: unknown 13271 1727203839.71240: variable 'ansible_search_path' from source: unknown 13271 1727203839.71291: calling self._execute() 13271 1727203839.71397: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.71408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.71423: variable 'omit' from source: magic vars 13271 1727203839.71799: variable 'ansible_distribution_major_version' from source: facts 13271 1727203839.71826: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203839.71935: variable 'omit' from source: magic vars 13271 1727203839.71938: variable 'omit' from source: magic vars 13271 1727203839.71987: variable 'profile' from source: include params 13271 1727203839.71998: variable 'item' from source: include params 13271 1727203839.72078: variable 'item' from source: include params 13271 1727203839.72119: variable 'omit' from source: magic vars 13271 1727203839.72169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203839.72227: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203839.72272: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203839.72301: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203839.72372: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203839.72378: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203839.72381: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.72383: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.72452: Set connection var ansible_connection to ssh 13271 1727203839.72494: Set connection var ansible_shell_type to sh 13271 1727203839.72509: Set connection var ansible_timeout to 10 13271 1727203839.72546: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203839.72549: Set connection var ansible_pipelining to False 13271 1727203839.72555: Set connection var ansible_shell_executable to /bin/sh 13271 1727203839.72679: variable 'ansible_shell_executable' from source: unknown 13271 1727203839.72683: variable 'ansible_connection' from source: unknown 13271 1727203839.72686: variable 'ansible_module_compression' from source: unknown 13271 1727203839.72689: variable 'ansible_shell_type' from source: unknown 13271 1727203839.72692: variable 'ansible_shell_executable' from source: unknown 13271 1727203839.72695: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.72697: variable 'ansible_pipelining' from source: unknown 13271 1727203839.72700: variable 'ansible_timeout' from source: unknown 13271 1727203839.72703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 
1727203839.72821: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
13271 1727203839.72825: variable 'omit' from source: magic vars
13271 1727203839.72827: starting attempt loop
13271 1727203839.72829: running the handler
13271 1727203839.72881: variable 'lsr_net_profile_fingerprint' from source: set_fact
13271 1727203839.72885: Evaluated conditional (lsr_net_profile_fingerprint): True
13271 1727203839.72897: handler run complete
13271 1727203839.72912: attempt loop complete, returning result
13271 1727203839.72915: _execute() done
13271 1727203839.72918: dumping result to json
13271 1727203839.72930: done dumping result, returning
13271 1727203839.72933: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.1 [028d2410-947f-2a40-12ba-000000000270]
13271 1727203839.72935: sending task result for task 028d2410-947f-2a40-12ba-000000000270
13271 1727203839.73018: done sending task result for task 028d2410-947f-2a40-12ba-000000000270
13271 1727203839.73021: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "changed": false
}

MSG:

All assertions passed
13271 1727203839.73080: no more pending results, returning what we have
13271 1727203839.73083: results queue empty
13271 1727203839.73083: checking for any_errors_fatal
13271 1727203839.73089: done checking for any_errors_fatal
13271 1727203839.73090: checking for max_fail_percentage
13271 1727203839.73091: done checking for max_fail_percentage
13271 1727203839.73092: checking to see if all hosts have failed and the running result is not ok
13271 1727203839.73093: done checking to see if all hosts have failed
13271 1727203839.73094: getting the remaining hosts for this loop
13271 1727203839.73095: done getting the remaining hosts for this loop
13271 1727203839.73098: getting the next task for host managed-node1
13271 1727203839.73105: done getting next task for host managed-node1
13271 1727203839.73109: ^ task is: TASK: ** TEST check polling interval
13271 1727203839.73111: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
13271 1727203839.73115: getting variables
13271 1727203839.73117: in VariableManager get_vars()
13271 1727203839.73156: Calling all_inventory to load vars for managed-node1
13271 1727203839.73159: Calling groups_inventory to load vars for managed-node1
13271 1727203839.73163: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203839.73174: Calling all_plugins_play to load vars for managed-node1
13271 1727203839.73179: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203839.73182: Calling groups_plugins_play to load vars for managed-node1
13271 1727203839.75210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203839.77284: done with get_vars()
13271 1727203839.77310: done getting variables
13271 1727203839.77408: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [** TEST check polling interval] ******************************************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:75
Tuesday 24 September 2024 14:50:39 -0400 (0:00:00.070) 0:00:23.417 *****
13271
1727203839.77436: entering _queue_task() for managed-node1/command 13271 1727203839.77924: worker is 1 (out of 1 available) 13271 1727203839.77937: exiting _queue_task() for managed-node1/command 13271 1727203839.77953: done queuing things up, now waiting for results queue to drain 13271 1727203839.77955: waiting for pending results... 13271 1727203839.78513: running TaskExecutor() for managed-node1/TASK: ** TEST check polling interval 13271 1727203839.78745: in run() - task 028d2410-947f-2a40-12ba-000000000071 13271 1727203839.78749: variable 'ansible_search_path' from source: unknown 13271 1727203839.78752: calling self._execute() 13271 1727203839.79019: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.79024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.79026: variable 'omit' from source: magic vars 13271 1727203839.79423: variable 'ansible_distribution_major_version' from source: facts 13271 1727203839.79440: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203839.79451: variable 'omit' from source: magic vars 13271 1727203839.79483: variable 'omit' from source: magic vars 13271 1727203839.79589: variable 'controller_device' from source: play vars 13271 1727203839.79616: variable 'omit' from source: magic vars 13271 1727203839.79667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203839.79711: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203839.79746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203839.79773: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203839.79793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 13271 1727203839.79827: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203839.79840: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.79852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.79954: Set connection var ansible_connection to ssh 13271 1727203839.80026: Set connection var ansible_shell_type to sh 13271 1727203839.80029: Set connection var ansible_timeout to 10 13271 1727203839.80031: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203839.80033: Set connection var ansible_pipelining to False 13271 1727203839.80035: Set connection var ansible_shell_executable to /bin/sh 13271 1727203839.80037: variable 'ansible_shell_executable' from source: unknown 13271 1727203839.80044: variable 'ansible_connection' from source: unknown 13271 1727203839.80056: variable 'ansible_module_compression' from source: unknown 13271 1727203839.80069: variable 'ansible_shell_type' from source: unknown 13271 1727203839.80163: variable 'ansible_shell_executable' from source: unknown 13271 1727203839.80167: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203839.80169: variable 'ansible_pipelining' from source: unknown 13271 1727203839.80173: variable 'ansible_timeout' from source: unknown 13271 1727203839.80177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203839.80252: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203839.80338: variable 'omit' from source: magic vars 13271 1727203839.80353: starting attempt loop 13271 1727203839.80370: running the handler 13271 
1727203839.80437: _low_level_execute_command(): starting 13271 1727203839.80440: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203839.81215: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203839.81272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203839.81292: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203839.81335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203839.81419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203839.83228: stdout chunk (state=3): >>>/root <<< 13271 1727203839.83384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203839.83388: stdout chunk (state=3): >>><<< 13271 1727203839.83390: stderr chunk (state=3): >>><<< 13271 1727203839.83512: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203839.83516: _low_level_execute_command(): starting 13271 1727203839.83519: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203839.8341799-14798-46815704984179 `" && echo ansible-tmp-1727203839.8341799-14798-46815704984179="` echo /root/.ansible/tmp/ansible-tmp-1727203839.8341799-14798-46815704984179 `" ) && sleep 0' 13271 1727203839.84079: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203839.84083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203839.84086: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203839.84089: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203839.84145: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203839.84148: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203839.84200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203839.84294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203839.86413: stdout chunk (state=3): >>>ansible-tmp-1727203839.8341799-14798-46815704984179=/root/.ansible/tmp/ansible-tmp-1727203839.8341799-14798-46815704984179 <<< 13271 1727203839.86544: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203839.86558: stderr chunk (state=3): >>><<< 13271 1727203839.86564: stdout chunk (state=3): >>><<< 13271 1727203839.86582: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203839.8341799-14798-46815704984179=/root/.ansible/tmp/ansible-tmp-1727203839.8341799-14798-46815704984179 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203839.86609: variable 'ansible_module_compression' from source: unknown 13271 1727203839.86652: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13271 1727203839.86688: variable 'ansible_facts' from source: unknown 13271 1727203839.86735: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203839.8341799-14798-46815704984179/AnsiballZ_command.py 13271 1727203839.86838: Sending initial data 13271 1727203839.86842: Sent initial data (155 bytes) 13271 1727203839.87251: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203839.87289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203839.87292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203839.87295: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 13271 1727203839.87297: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 13271 1727203839.87299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203839.87350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203839.87355: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203839.87357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203839.87435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203839.89177: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 13271 1727203839.89181: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203839.89250: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13271 1727203839.89325: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpvgqh3ivo /root/.ansible/tmp/ansible-tmp-1727203839.8341799-14798-46815704984179/AnsiballZ_command.py <<< 13271 1727203839.89330: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203839.8341799-14798-46815704984179/AnsiballZ_command.py" <<< 13271 1727203839.89400: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpvgqh3ivo" to remote "/root/.ansible/tmp/ansible-tmp-1727203839.8341799-14798-46815704984179/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203839.8341799-14798-46815704984179/AnsiballZ_command.py" <<< 13271 1727203839.90092: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203839.90127: stderr chunk (state=3): >>><<< 13271 1727203839.90130: stdout chunk (state=3): >>><<< 13271 1727203839.90147: done transferring module to remote 13271 1727203839.90156: _low_level_execute_command(): starting 13271 1727203839.90162: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203839.8341799-14798-46815704984179/ /root/.ansible/tmp/ansible-tmp-1727203839.8341799-14798-46815704984179/AnsiballZ_command.py && sleep 0' 13271 1727203839.91279: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203839.91350: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203839.91362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203839.91460: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203839.91688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203839.93640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203839.93644: stdout chunk (state=3): >>><<< 13271 1727203839.93685: stderr chunk (state=3): >>><<< 13271 1727203839.93690: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203839.93693: _low_level_execute_command(): starting 13271 1727203839.93696: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203839.8341799-14798-46815704984179/AnsiballZ_command.py && sleep 0' 13271 1727203839.94354: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203839.94420: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203839.94436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203839.94471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203839.94505: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203839.94574: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203839.94580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203839.94684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203839.94796: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203839.94924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203840.11786: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-24 14:50:40.112464", "end": "2024-09-24 14:50:40.116147", "delta": "0:00:00.003683", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13271 1727203840.13698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203840.13702: stdout chunk (state=3): >>><<< 13271 1727203840.13705: stderr chunk (state=3): >>><<< 13271 1727203840.13725: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-24 14:50:40.112464", "end": "2024-09-24 14:50:40.116147", "delta": "0:00:00.003683", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
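The record above shows the remote module returning `"stdout": "MII Polling Interval (ms): 110"` from `grep 'Polling Interval' /proc/net/bonding/nm-bond`. The play later validates this with a plain substring test (`'110' in result.stdout`). As a minimal sketch of that check, assuming the `/proc/net/bonding/<dev>` output format captured in this log, the interval can also be parsed out numerically:

```python
import re

# stdout exactly as captured in the log record above (assumption: the kernel's
# /proc/net/bonding/<dev> file prints "MII Polling Interval (ms): N").
stdout = "MII Polling Interval (ms): 110"

# The playbook's own conditional is just a substring test:
assert "110" in stdout

# A stricter alternative: extract the integer value with a regex.
match = re.search(r"Polling Interval \(ms\):\s*(\d+)", stdout)
interval_ms = int(match.group(1)) if match else None
print(interval_ms)
```

The substring form is what the play evaluates; the regex form avoids false positives (e.g. `110` appearing in an unrelated field).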
13271 1727203840.13768: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/nm-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203839.8341799-14798-46815704984179/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203840.13804: _low_level_execute_command(): starting 13271 1727203840.13807: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203839.8341799-14798-46815704984179/ > /dev/null 2>&1 && sleep 0' 13271 1727203840.14393: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203840.14407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203840.14430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203840.14516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203840.14545: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203840.14642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203840.14667: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203840.14750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203840.17047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203840.17051: stdout chunk (state=3): >>><<< 13271 1727203840.17053: stderr chunk (state=3): >>><<< 13271 1727203840.17100: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203840.17105: handler run complete 13271 1727203840.17286: Evaluated conditional (False): False 13271 1727203840.17323: variable 'result' from source: unknown 13271 1727203840.17339: Evaluated conditional ('110' in result.stdout): True 13271 1727203840.17350: attempt loop complete, returning result 13271 1727203840.17353: _execute() done 13271 1727203840.17356: dumping result to json 13271 1727203840.17362: done dumping result, returning 13271 1727203840.17374: done running TaskExecutor() for managed-node1/TASK: ** TEST check polling interval [028d2410-947f-2a40-12ba-000000000071] 13271 1727203840.17380: sending task result for task 028d2410-947f-2a40-12ba-000000000071 13271 1727203840.17487: done sending task result for task 028d2410-947f-2a40-12ba-000000000071 13271 1727203840.17490: WORKER PROCESS EXITING ok: [managed-node1] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/nm-bond" ], "delta": "0:00:00.003683", "end": "2024-09-24 14:50:40.116147", "rc": 0, "start": "2024-09-24 14:50:40.112464" } STDOUT: MII Polling Interval (ms): 110 13271 1727203840.17590: no more pending results, returning what we have 13271 1727203840.17594: results queue empty 13271 1727203840.17595: checking for any_errors_fatal 13271 1727203840.17603: done checking for any_errors_fatal 13271 1727203840.17604: checking for max_fail_percentage 13271 1727203840.17606: done checking for max_fail_percentage 13271 1727203840.17607: checking to see if all hosts have failed and the running result is not ok 13271 1727203840.17608: done checking to see if all hosts have failed 13271 1727203840.17608: getting the remaining hosts for this loop 13271 1727203840.17610: done getting the remaining hosts for this loop 13271 1727203840.17613: getting the next task for host managed-node1 13271 1727203840.17620: done getting next task for 
host managed-node1 13271 1727203840.17622: ^ task is: TASK: ** TEST check IPv4 13271 1727203840.17624: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203840.17628: getting variables 13271 1727203840.17630: in VariableManager get_vars() 13271 1727203840.17673: Calling all_inventory to load vars for managed-node1 13271 1727203840.17798: Calling groups_inventory to load vars for managed-node1 13271 1727203840.17802: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203840.17815: Calling all_plugins_play to load vars for managed-node1 13271 1727203840.17818: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203840.17821: Calling groups_plugins_play to load vars for managed-node1 13271 1727203840.19799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203840.23116: done with get_vars() 13271 1727203840.23145: done getting variables 13271 1727203840.23209: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:80 Tuesday 24 September 2024 14:50:40 -0400 (0:00:00.457) 0:00:23.875 ***** 13271 1727203840.23239: entering _queue_task() for managed-node1/command 13271 1727203840.23899: worker is 1 (out of 1 available) 13271 1727203840.23913: exiting _queue_task() for 
managed-node1/command 13271 1727203840.23926: done queuing things up, now waiting for results queue to drain 13271 1727203840.23928: waiting for pending results... 13271 1727203840.24611: running TaskExecutor() for managed-node1/TASK: ** TEST check IPv4 13271 1727203840.24658: in run() - task 028d2410-947f-2a40-12ba-000000000072 13271 1727203840.24681: variable 'ansible_search_path' from source: unknown 13271 1727203840.24863: calling self._execute() 13271 1727203840.25015: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203840.25019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203840.25218: variable 'omit' from source: magic vars 13271 1727203840.26086: variable 'ansible_distribution_major_version' from source: facts 13271 1727203840.26090: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203840.26092: variable 'omit' from source: magic vars 13271 1727203840.26095: variable 'omit' from source: magic vars 13271 1727203840.26108: variable 'controller_device' from source: play vars 13271 1727203840.26131: variable 'omit' from source: magic vars 13271 1727203840.26381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203840.26384: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203840.26386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203840.26387: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203840.26389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203840.26505: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203840.26509: variable 'ansible_host' from 
source: host vars for 'managed-node1' 13271 1727203840.26512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203840.26726: Set connection var ansible_connection to ssh 13271 1727203840.26733: Set connection var ansible_shell_type to sh 13271 1727203840.26741: Set connection var ansible_timeout to 10 13271 1727203840.26746: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203840.26752: Set connection var ansible_pipelining to False 13271 1727203840.26758: Set connection var ansible_shell_executable to /bin/sh 13271 1727203840.27080: variable 'ansible_shell_executable' from source: unknown 13271 1727203840.27083: variable 'ansible_connection' from source: unknown 13271 1727203840.27086: variable 'ansible_module_compression' from source: unknown 13271 1727203840.27088: variable 'ansible_shell_type' from source: unknown 13271 1727203840.27091: variable 'ansible_shell_executable' from source: unknown 13271 1727203840.27093: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203840.27095: variable 'ansible_pipelining' from source: unknown 13271 1727203840.27097: variable 'ansible_timeout' from source: unknown 13271 1727203840.27099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203840.27102: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203840.27105: variable 'omit' from source: magic vars 13271 1727203840.27107: starting attempt loop 13271 1727203840.27110: running the handler 13271 1727203840.27112: _low_level_execute_command(): starting 13271 1727203840.27114: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203840.27850: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203840.27863: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203840.27880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203840.27901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203840.27911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203840.27951: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203840.28018: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203840.28032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203840.28047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203840.28159: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203840.29960: stdout chunk (state=3): >>>/root <<< 13271 1727203840.30110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203840.30116: stdout chunk (state=3): >>><<< 13271 1727203840.30125: stderr chunk (state=3): >>><<< 13271 1727203840.30153: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203840.30193: _low_level_execute_command(): starting 13271 1727203840.30197: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203840.301538-14828-197010340223855 `" && echo ansible-tmp-1727203840.301538-14828-197010340223855="` echo /root/.ansible/tmp/ansible-tmp-1727203840.301538-14828-197010340223855 `" ) && sleep 0' 13271 1727203840.31404: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203840.31546: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203840.31549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203840.31652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203840.31709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203840.31781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203840.33920: stdout chunk (state=3): >>>ansible-tmp-1727203840.301538-14828-197010340223855=/root/.ansible/tmp/ansible-tmp-1727203840.301538-14828-197010340223855 <<< 13271 1727203840.34083: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203840.34088: stdout chunk (state=3): >>><<< 13271 1727203840.34091: stderr chunk (state=3): >>><<< 13271 1727203840.34117: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203840.301538-14828-197010340223855=/root/.ansible/tmp/ansible-tmp-1727203840.301538-14828-197010340223855 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203840.34197: variable 'ansible_module_compression' from source: unknown 13271 1727203840.34229: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13271 1727203840.34280: variable 'ansible_facts' from source: unknown 13271 1727203840.34331: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203840.301538-14828-197010340223855/AnsiballZ_command.py 13271 1727203840.34441: Sending initial data 13271 1727203840.34444: Sent initial data (155 bytes) 13271 1727203840.34885: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203840.34889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203840.34893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203840.34895: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203840.34897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 13271 1727203840.34908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203840.34954: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203840.34970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203840.35046: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203840.36801: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203840.36884: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203840.36969: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmper9gyrb9 /root/.ansible/tmp/ansible-tmp-1727203840.301538-14828-197010340223855/AnsiballZ_command.py <<< 13271 1727203840.36972: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203840.301538-14828-197010340223855/AnsiballZ_command.py" <<< 13271 1727203840.37040: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmper9gyrb9" to remote "/root/.ansible/tmp/ansible-tmp-1727203840.301538-14828-197010340223855/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203840.301538-14828-197010340223855/AnsiballZ_command.py" <<< 13271 1727203840.38008: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203840.38011: stdout chunk (state=3): >>><<< 13271 1727203840.38013: stderr chunk (state=3): >>><<< 13271 1727203840.38015: done transferring module to remote 13271 1727203840.38018: _low_level_execute_command(): starting 13271 1727203840.38020: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203840.301538-14828-197010340223855/ /root/.ansible/tmp/ansible-tmp-1727203840.301538-14828-197010340223855/AnsiballZ_command.py && sleep 0' 13271 1727203840.38571: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203840.38588: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203840.38611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203840.38644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203840.38690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203840.38734: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203840.38738: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203840.38883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203840.38887: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203840.38996: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203840.41007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203840.41030: stderr chunk (state=3): >>><<< 13271 1727203840.41033: stdout chunk (state=3): >>><<< 13271 1727203840.41049: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203840.41126: _low_level_execute_command(): starting 13271 1727203840.41130: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203840.301538-14828-197010340223855/AnsiballZ_command.py && sleep 0' 13271 1727203840.41646: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203840.41664: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203840.41683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203840.41700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203840.41716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203840.41793: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203840.41820: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203840.41851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203840.41956: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203840.59256: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.124/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 235sec preferred_lft 235sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:50:40.586882", "end": "2024-09-24 14:50:40.590780", "delta": "0:00:00.003898", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13271 1727203840.61183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203840.61187: stdout chunk (state=3): >>><<< 13271 1727203840.61189: stderr chunk (state=3): >>><<< 13271 1727203840.61191: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.124/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 235sec preferred_lft 235sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:50:40.586882", "end": "2024-09-24 14:50:40.590780", "delta": "0:00:00.003898", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.14.47 closed. 13271 1727203840.61194: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203840.301538-14828-197010340223855/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203840.61197: _low_level_execute_command(): starting 13271 1727203840.61199: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203840.301538-14828-197010340223855/ > /dev/null 2>&1 && sleep 0' 13271 1727203840.61764: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203840.61768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203840.61784: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203840.61801: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
13271 1727203840.61804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203840.61868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203840.61871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203840.61874: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203840.61941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203840.64081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203840.64085: stdout chunk (state=3): >>><<< 13271 1727203840.64087: stderr chunk (state=3): >>><<< 13271 1727203840.64090: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203840.64097: handler run complete 13271 1727203840.64102: Evaluated conditional (False): False 13271 1727203840.64195: variable 'result' from source: set_fact 13271 1727203840.64211: Evaluated conditional ('192.0.2' in result.stdout): True 13271 1727203840.64224: attempt loop complete, returning result 13271 1727203840.64227: _execute() done 13271 1727203840.64229: dumping result to json 13271 1727203840.64235: done dumping result, returning 13271 1727203840.64245: done running TaskExecutor() for managed-node1/TASK: ** TEST check IPv4 [028d2410-947f-2a40-12ba-000000000072] 13271 1727203840.64248: sending task result for task 028d2410-947f-2a40-12ba-000000000072 ok: [managed-node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003898", "end": "2024-09-24 14:50:40.590780", "rc": 0, "start": "2024-09-24 14:50:40.586882" } STDOUT: 13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.124/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 235sec preferred_lft 235sec 13271 1727203840.64498: no more pending results, returning what we have 13271 1727203840.64501: results queue empty 13271 1727203840.64502: checking for any_errors_fatal 13271 1727203840.64508: done checking for any_errors_fatal 13271 1727203840.64508: checking for max_fail_percentage 13271 1727203840.64510: done checking for max_fail_percentage 13271 1727203840.64511: checking to see if all hosts have failed and the running result is not ok 13271 1727203840.64512: done checking to see if all hosts have failed 13271 1727203840.64512: getting the remaining hosts for this loop 13271 1727203840.64513: done getting the remaining hosts for this loop 13271 1727203840.64517: getting the next 
task for host managed-node1 13271 1727203840.64521: done getting next task for host managed-node1 13271 1727203840.64524: ^ task is: TASK: ** TEST check IPv6 13271 1727203840.64525: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203840.64533: getting variables 13271 1727203840.64535: in VariableManager get_vars() 13271 1727203840.64572: Calling all_inventory to load vars for managed-node1 13271 1727203840.64575: Calling groups_inventory to load vars for managed-node1 13271 1727203840.64578: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203840.64617: Calling all_plugins_play to load vars for managed-node1 13271 1727203840.64621: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203840.64624: Calling groups_plugins_play to load vars for managed-node1 13271 1727203840.65151: done sending task result for task 028d2410-947f-2a40-12ba-000000000072 13271 1727203840.65154: WORKER PROCESS EXITING 13271 1727203840.66143: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203840.67754: done with get_vars() 13271 1727203840.67788: done getting variables 13271 1727203840.67850: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:87 Tuesday 24 September 2024 14:50:40 
-0400 (0:00:00.446) 0:00:24.322 ***** 13271 1727203840.67909: entering _queue_task() for managed-node1/command 13271 1727203840.68316: worker is 1 (out of 1 available) 13271 1727203840.68488: exiting _queue_task() for managed-node1/command 13271 1727203840.68500: done queuing things up, now waiting for results queue to drain 13271 1727203840.68502: waiting for pending results... 13271 1727203840.68812: running TaskExecutor() for managed-node1/TASK: ** TEST check IPv6 13271 1727203840.68873: in run() - task 028d2410-947f-2a40-12ba-000000000073 13271 1727203840.68897: variable 'ansible_search_path' from source: unknown 13271 1727203840.68946: calling self._execute() 13271 1727203840.69163: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203840.69168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203840.69171: variable 'omit' from source: magic vars 13271 1727203840.69453: variable 'ansible_distribution_major_version' from source: facts 13271 1727203840.69474: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203840.69490: variable 'omit' from source: magic vars 13271 1727203840.69521: variable 'omit' from source: magic vars 13271 1727203840.69630: variable 'controller_device' from source: play vars 13271 1727203840.69652: variable 'omit' from source: magic vars 13271 1727203840.69705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203840.69751: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203840.69783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203840.69820: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203840.69833: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203840.69934: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203840.69937: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203840.69940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203840.69991: Set connection var ansible_connection to ssh 13271 1727203840.70003: Set connection var ansible_shell_type to sh 13271 1727203840.70015: Set connection var ansible_timeout to 10 13271 1727203840.70024: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203840.70032: Set connection var ansible_pipelining to False 13271 1727203840.70050: Set connection var ansible_shell_executable to /bin/sh 13271 1727203840.70083: variable 'ansible_shell_executable' from source: unknown 13271 1727203840.70092: variable 'ansible_connection' from source: unknown 13271 1727203840.70098: variable 'ansible_module_compression' from source: unknown 13271 1727203840.70104: variable 'ansible_shell_type' from source: unknown 13271 1727203840.70110: variable 'ansible_shell_executable' from source: unknown 13271 1727203840.70116: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203840.70151: variable 'ansible_pipelining' from source: unknown 13271 1727203840.70154: variable 'ansible_timeout' from source: unknown 13271 1727203840.70156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203840.70297: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203840.70320: variable 'omit' from source: magic vars 13271 1727203840.70373: starting 
attempt loop 13271 1727203840.70378: running the handler 13271 1727203840.70380: _low_level_execute_command(): starting 13271 1727203840.70382: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203840.71137: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203840.71194: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 13271 1727203840.71208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203840.71290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203840.71309: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203840.71330: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203840.71449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203840.73273: stdout chunk (state=3): >>>/root <<< 13271 1727203840.73486: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203840.73490: stdout chunk (state=3): >>><<< 13271 1727203840.73492: stderr chunk (state=3): >>><<< 13271 
1727203840.73496: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203840.73516: _low_level_execute_command(): starting 13271 1727203840.73529: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203840.735002-14852-159402822495633 `" && echo ansible-tmp-1727203840.735002-14852-159402822495633="` echo /root/.ansible/tmp/ansible-tmp-1727203840.735002-14852-159402822495633 `" ) && sleep 0' 13271 1727203840.74265: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203840.74269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203840.74272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13271 1727203840.74291: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 13271 1727203840.74294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203840.74341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203840.74345: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203840.74424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203840.76559: stdout chunk (state=3): >>>ansible-tmp-1727203840.735002-14852-159402822495633=/root/.ansible/tmp/ansible-tmp-1727203840.735002-14852-159402822495633 <<< 13271 1727203840.76674: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203840.76691: stderr chunk (state=3): >>><<< 13271 1727203840.76695: stdout chunk (state=3): >>><<< 13271 1727203840.76711: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203840.735002-14852-159402822495633=/root/.ansible/tmp/ansible-tmp-1727203840.735002-14852-159402822495633 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203840.76738: variable 'ansible_module_compression' from source: unknown 13271 1727203840.76791: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13271 1727203840.76824: variable 'ansible_facts' from source: unknown 13271 1727203840.76873: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203840.735002-14852-159402822495633/AnsiballZ_command.py 13271 1727203840.76978: Sending initial data 13271 1727203840.76982: Sent initial data (155 bytes) 13271 1727203840.77413: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203840.77416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203840.77419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203840.77421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 13271 1727203840.77423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203840.77470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203840.77473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203840.77568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203840.79337: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203840.79429: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13271 1727203840.79509: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmp3jw26kcs /root/.ansible/tmp/ansible-tmp-1727203840.735002-14852-159402822495633/AnsiballZ_command.py <<< 13271 1727203840.79517: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203840.735002-14852-159402822495633/AnsiballZ_command.py" <<< 13271 1727203840.79579: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmp3jw26kcs" to remote "/root/.ansible/tmp/ansible-tmp-1727203840.735002-14852-159402822495633/AnsiballZ_command.py" <<< 13271 1727203840.79582: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203840.735002-14852-159402822495633/AnsiballZ_command.py" <<< 13271 1727203840.80262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203840.80304: stderr chunk (state=3): >>><<< 13271 1727203840.80307: stdout chunk (state=3): >>><<< 13271 1727203840.80328: done transferring module to remote 13271 1727203840.80337: _low_level_execute_command(): starting 13271 1727203840.80341: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203840.735002-14852-159402822495633/ /root/.ansible/tmp/ansible-tmp-1727203840.735002-14852-159402822495633/AnsiballZ_command.py && sleep 0' 13271 1727203840.81044: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203840.81152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203840.81156: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203840.81520: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203840.83326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203840.83371: stderr chunk (state=3): >>><<< 13271 1727203840.83385: stdout chunk (state=3): >>><<< 13271 1727203840.83409: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203840.83417: _low_level_execute_command(): starting 13271 1727203840.83425: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203840.735002-14852-159402822495633/AnsiballZ_command.py && sleep 0' 13271 1727203840.83982: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203840.83996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203840.84008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203840.84025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203840.84041: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203840.84052: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203840.84064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203840.84089: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 1727203840.84103: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 13271 1727203840.84114: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13271 1727203840.84125: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203840.84190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203840.84216: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203840.84231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203840.84257: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203840.84378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203841.01318: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::13/128 scope global dynamic noprefixroute \n valid_lft 236sec preferred_lft 236sec\n inet6 2001:db8::b03b:fff:fee2:2a79/64 scope global dynamic noprefixroute \n valid_lft 1795sec preferred_lft 1795sec\n inet6 fe80::b03b:fff:fee2:2a79/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:50:41.007519", "end": "2024-09-24 14:50:41.011388", "delta": "0:00:00.003869", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13271 1727203841.03115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203841.03139: stderr chunk (state=3): >>><<< 13271 1727203841.03143: stdout chunk (state=3): >>><<< 13271 1727203841.03159: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::13/128 scope global dynamic noprefixroute \n valid_lft 236sec preferred_lft 236sec\n inet6 2001:db8::b03b:fff:fee2:2a79/64 scope global dynamic noprefixroute \n valid_lft 1795sec preferred_lft 1795sec\n inet6 fe80::b03b:fff:fee2:2a79/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:50:41.007519", "end": "2024-09-24 14:50:41.011388", "delta": "0:00:00.003869", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 13271 1727203841.03194: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203840.735002-14852-159402822495633/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203841.03202: _low_level_execute_command(): starting 13271 1727203841.03208: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203840.735002-14852-159402822495633/ > /dev/null 2>&1 && sleep 0' 13271 1727203841.03837: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203841.03856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203841.03984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203841.05984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203841.06000: stdout chunk (state=3): >>><<< 13271 1727203841.06002: stderr chunk (state=3): >>><<< 13271 1727203841.06082: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13271 1727203841.06087: handler run complete
13271 1727203841.06089: Evaluated conditional (False): False
13271 1727203841.06149: variable 'result' from source: set_fact
13271 1727203841.06181: Evaluated conditional ('2001' in result.stdout): True
13271 1727203841.06184: attempt loop complete, returning result
13271 1727203841.06187: _execute() done
13271 1727203841.06206: dumping result to json
13271 1727203841.06209: done dumping result, returning
13271 1727203841.06212: done running TaskExecutor() for managed-node1/TASK: ** TEST check IPv6 [028d2410-947f-2a40-12ba-000000000073]
13271 1727203841.06214: sending task result for task 028d2410-947f-2a40-12ba-000000000073
13271 1727203841.06306: done sending task result for task 028d2410-947f-2a40-12ba-000000000073
13271 1727203841.06308: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "attempts": 1,
    "changed": false,
    "cmd": [
        "ip",
        "-6",
        "a",
        "s",
        "nm-bond"
    ],
    "delta": "0:00:00.003869",
    "end": "2024-09-24 14:50:41.011388",
    "rc": 0,
    "start": "2024-09-24 14:50:41.007519"
}

STDOUT:

13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000
    inet6 2001:db8::13/128 scope global dynamic noprefixroute
       valid_lft 236sec preferred_lft 236sec
    inet6 2001:db8::b03b:fff:fee2:2a79/64 scope global dynamic noprefixroute
       valid_lft 1795sec preferred_lft 1795sec
    inet6 fe80::b03b:fff:fee2:2a79/64 scope link noprefixroute
       valid_lft forever preferred_lft forever
13271 1727203841.06401: no more pending results, returning what we have
13271 1727203841.06405: results queue empty
13271 1727203841.06405: checking for any_errors_fatal
13271 1727203841.06413: done checking for any_errors_fatal
13271 1727203841.06413: checking for max_fail_percentage
13271 1727203841.06415: done checking for max_fail_percentage
13271 1727203841.06416: checking to see if all hosts have failed and the running result is not ok
13271 1727203841.06417: done checking to
see if all hosts have failed 13271 1727203841.06418: getting the remaining hosts for this loop 13271 1727203841.06419: done getting the remaining hosts for this loop 13271 1727203841.06422: getting the next task for host managed-node1 13271 1727203841.06434: done getting next task for host managed-node1 13271 1727203841.06438: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13271 1727203841.06442: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13271 1727203841.06459: getting variables 13271 1727203841.06460: in VariableManager get_vars() 13271 1727203841.06501: Calling all_inventory to load vars for managed-node1 13271 1727203841.06503: Calling groups_inventory to load vars for managed-node1 13271 1727203841.06505: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203841.06514: Calling all_plugins_play to load vars for managed-node1 13271 1727203841.06517: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203841.06519: Calling groups_plugins_play to load vars for managed-node1 13271 1727203841.07939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203841.10089: done with get_vars() 13271 1727203841.10115: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:50:41 -0400 (0:00:00.423) 0:00:24.745 ***** 13271 1727203841.10216: entering _queue_task() for managed-node1/include_tasks 13271 1727203841.10542: worker is 1 (out of 1 available) 13271 1727203841.10553: exiting _queue_task() for managed-node1/include_tasks 13271 1727203841.10566: done queuing things up, now waiting for results queue to drain 13271 1727203841.10568: waiting for pending results... 
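The "** TEST check IPv6" task above ran `ip -6 a s nm-bond` and passed because the conditional `'2001' in result.stdout` evaluated True against the captured output. A minimal Python sketch of that check (the stdout is copied from the log; the helper name is my own, not anything Ansible defines):

```python
# Stdout captured by the "** TEST check IPv6" task in the log above
# (output of `ip -6 a s nm-bond` on managed-node1).
STDOUT = """13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000
    inet6 2001:db8::13/128 scope global dynamic noprefixroute
       valid_lft 236sec preferred_lft 236sec
    inet6 2001:db8::b03b:fff:fee2:2a79/64 scope global dynamic noprefixroute
       valid_lft 1795sec preferred_lft 1795sec
    inet6 fe80::b03b:fff:fee2:2a79/64 scope link noprefixroute
       valid_lft forever preferred_lft forever"""

def has_global_v6(stdout: str, prefix: str = "2001") -> bool:
    """Mirror the playbook's `'2001' in result.stdout` conditional:
    a plain substring test, not a real address parse."""
    return prefix in stdout

print(has_global_v6(STDOUT))  # the log shows this conditional evaluated True
```

Note the check is a substring match, so it would also fire on a `2001`-containing route or flag; the test playbook evidently accepts that looseness in exchange for simplicity.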
13271 1727203841.10904: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 13271 1727203841.11014: in run() - task 028d2410-947f-2a40-12ba-00000000007c 13271 1727203841.11032: variable 'ansible_search_path' from source: unknown 13271 1727203841.11038: variable 'ansible_search_path' from source: unknown 13271 1727203841.11081: calling self._execute() 13271 1727203841.11183: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203841.11217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203841.11222: variable 'omit' from source: magic vars 13271 1727203841.11596: variable 'ansible_distribution_major_version' from source: facts 13271 1727203841.11612: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203841.11654: _execute() done 13271 1727203841.11658: dumping result to json 13271 1727203841.11660: done dumping result, returning 13271 1727203841.11663: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [028d2410-947f-2a40-12ba-00000000007c] 13271 1727203841.11665: sending task result for task 028d2410-947f-2a40-12ba-00000000007c 13271 1727203841.11813: no more pending results, returning what we have 13271 1727203841.11818: in VariableManager get_vars() 13271 1727203841.11872: Calling all_inventory to load vars for managed-node1 13271 1727203841.11877: Calling groups_inventory to load vars for managed-node1 13271 1727203841.11880: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203841.11895: Calling all_plugins_play to load vars for managed-node1 13271 1727203841.11898: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203841.11902: Calling groups_plugins_play to load vars for managed-node1 13271 1727203841.12790: done sending task result for task 028d2410-947f-2a40-12ba-00000000007c 13271 
1727203841.12793: WORKER PROCESS EXITING 13271 1727203841.13619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203841.15167: done with get_vars() 13271 1727203841.15190: variable 'ansible_search_path' from source: unknown 13271 1727203841.15192: variable 'ansible_search_path' from source: unknown 13271 1727203841.15233: we have included files to process 13271 1727203841.15235: generating all_blocks data 13271 1727203841.15237: done generating all_blocks data 13271 1727203841.15241: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13271 1727203841.15242: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13271 1727203841.15245: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 13271 1727203841.16207: done processing included file 13271 1727203841.16210: iterating over new_blocks loaded from include file 13271 1727203841.16211: in VariableManager get_vars() 13271 1727203841.16237: done with get_vars() 13271 1727203841.16239: filtering new block on tags 13271 1727203841.16269: done filtering new block on tags 13271 1727203841.16272: in VariableManager get_vars() 13271 1727203841.16501: done with get_vars() 13271 1727203841.16503: filtering new block on tags 13271 1727203841.16542: done filtering new block on tags 13271 1727203841.16545: in VariableManager get_vars() 13271 1727203841.16567: done with get_vars() 13271 1727203841.16568: filtering new block on tags 13271 1727203841.16607: done filtering new block on tags 13271 1727203841.16610: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 13271 1727203841.16615: extending task lists for all hosts 
with included blocks 13271 1727203841.18878: done extending task lists 13271 1727203841.18880: done processing included files 13271 1727203841.18880: results queue empty 13271 1727203841.18881: checking for any_errors_fatal 13271 1727203841.18885: done checking for any_errors_fatal 13271 1727203841.18886: checking for max_fail_percentage 13271 1727203841.18887: done checking for max_fail_percentage 13271 1727203841.18888: checking to see if all hosts have failed and the running result is not ok 13271 1727203841.18889: done checking to see if all hosts have failed 13271 1727203841.18890: getting the remaining hosts for this loop 13271 1727203841.18891: done getting the remaining hosts for this loop 13271 1727203841.18894: getting the next task for host managed-node1 13271 1727203841.18899: done getting next task for host managed-node1 13271 1727203841.18902: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13271 1727203841.18906: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13271 1727203841.18916: getting variables 13271 1727203841.18918: in VariableManager get_vars() 13271 1727203841.18936: Calling all_inventory to load vars for managed-node1 13271 1727203841.18938: Calling groups_inventory to load vars for managed-node1 13271 1727203841.18940: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203841.18946: Calling all_plugins_play to load vars for managed-node1 13271 1727203841.18948: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203841.18951: Calling groups_plugins_play to load vars for managed-node1 13271 1727203841.20147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203841.21781: done with get_vars() 13271 1727203841.21801: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:50:41 -0400 (0:00:00.116) 0:00:24.861 ***** 13271 1727203841.21881: entering _queue_task() for managed-node1/setup 13271 1727203841.22228: worker is 1 (out of 1 available) 13271 1727203841.22240: exiting _queue_task() for managed-node1/setup 13271 1727203841.22252: done queuing things up, now waiting for results queue to drain 13271 1727203841.22255: waiting for pending results... 
13271 1727203841.22536: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 13271 1727203841.22717: in run() - task 028d2410-947f-2a40-12ba-000000000491 13271 1727203841.22739: variable 'ansible_search_path' from source: unknown 13271 1727203841.22746: variable 'ansible_search_path' from source: unknown 13271 1727203841.22792: calling self._execute() 13271 1727203841.22901: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203841.22917: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203841.22934: variable 'omit' from source: magic vars 13271 1727203841.23311: variable 'ansible_distribution_major_version' from source: facts 13271 1727203841.23328: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203841.23682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13271 1727203841.25793: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13271 1727203841.25877: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13271 1727203841.25917: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13271 1727203841.25953: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13271 1727203841.25991: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13271 1727203841.26078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203841.26113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203841.26141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203841.26193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203841.26211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203841.26267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203841.26300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203841.26327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203841.26371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203841.26393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203841.26556: variable '__network_required_facts' from source: role 
'' defaults
13271 1727203841.26615: variable 'ansible_facts' from source: unknown
13271 1727203841.27331: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
13271 1727203841.27339: when evaluation is False, skipping this task
13271 1727203841.27346: _execute() done
13271 1727203841.27354: dumping result to json
13271 1727203841.27364: done dumping result, returning
13271 1727203841.27381: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [028d2410-947f-2a40-12ba-000000000491]
13271 1727203841.27392: sending task result for task 028d2410-947f-2a40-12ba-000000000491
13271 1727203841.27545: done sending task result for task 028d2410-947f-2a40-12ba-000000000491
13271 1727203841.27548: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
13271 1727203841.27596: no more pending results, returning what we have
13271 1727203841.27601: results queue empty
13271 1727203841.27602: checking for any_errors_fatal
13271 1727203841.27603: done checking for any_errors_fatal
13271 1727203841.27604: checking for max_fail_percentage
13271 1727203841.27605: done checking for max_fail_percentage
13271 1727203841.27606: checking to see if all hosts have failed and the running result is not ok
13271 1727203841.27607: done checking to see if all hosts have failed
13271 1727203841.27608: getting the remaining hosts for this loop
13271 1727203841.27609: done getting the remaining hosts for this loop
13271 1727203841.27612: getting the next task for host managed-node1
13271 1727203841.27621: done getting next task for host managed-node1
13271 1727203841.27625: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
13271 1727203841.27630: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1,
handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13271 1727203841.27648: getting variables 13271 1727203841.27649: in VariableManager get_vars() 13271 1727203841.27694: Calling all_inventory to load vars for managed-node1 13271 1727203841.27697: Calling groups_inventory to load vars for managed-node1 13271 1727203841.27699: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203841.27710: Calling all_plugins_play to load vars for managed-node1 13271 1727203841.27712: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203841.27715: Calling groups_plugins_play to load vars for managed-node1 13271 1727203841.29350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203841.30931: done with get_vars() 13271 1727203841.30955: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:50:41 -0400 (0:00:00.091) 0:00:24.953 ***** 13271 1727203841.31066: entering _queue_task() for managed-node1/stat 13271 1727203841.31391: worker is 1 (out of 1 available) 13271 1727203841.31403: exiting _queue_task() for managed-node1/stat 13271 1727203841.31416: done queuing things up, now waiting for results queue to drain 13271 1727203841.31418: waiting for pending results... 
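The "Ensure ansible_facts used by role are present" task above was skipped because `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluated False, i.e. every fact the role needs had already been gathered. The same gate in plain Python (the fact names below are illustrative only; the real list comes from the role's defaults, which the log does not show):

```python
def missing_facts(required, ansible_facts):
    """Rough equivalent of Jinja's
    `required | difference(ansible_facts.keys() | list)`:
    the required fact names not present in the gathered facts."""
    return [name for name in required if name not in ansible_facts]

# Illustrative names only, not taken from the role.
required = ["distribution", "distribution_major_version"]
facts = {"distribution": "Fedora", "distribution_major_version": "40"}

# `| length > 0` being True would trigger a setup (fact-gathering) run;
# here nothing is missing, so the task is skipped, matching the
# `Evaluated conditional (...): False` line in the log.
print(len(missing_facts(required, facts)) > 0)  # False
```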
13271 1727203841.31797: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 13271 1727203841.31880: in run() - task 028d2410-947f-2a40-12ba-000000000493 13271 1727203841.31906: variable 'ansible_search_path' from source: unknown 13271 1727203841.31913: variable 'ansible_search_path' from source: unknown 13271 1727203841.31952: calling self._execute() 13271 1727203841.32051: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203841.32065: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203841.32082: variable 'omit' from source: magic vars 13271 1727203841.32450: variable 'ansible_distribution_major_version' from source: facts 13271 1727203841.32470: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203841.32640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13271 1727203841.32909: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13271 1727203841.32953: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13271 1727203841.32999: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13271 1727203841.33042: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13271 1727203841.33198: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13271 1727203841.33202: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13271 1727203841.33212: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203841.33244: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13271 1727203841.33345: variable '__network_is_ostree' from source: set_fact 13271 1727203841.33357: Evaluated conditional (not __network_is_ostree is defined): False 13271 1727203841.33369: when evaluation is False, skipping this task 13271 1727203841.33378: _execute() done 13271 1727203841.33386: dumping result to json 13271 1727203841.33394: done dumping result, returning 13271 1727203841.33405: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [028d2410-947f-2a40-12ba-000000000493] 13271 1727203841.33523: sending task result for task 028d2410-947f-2a40-12ba-000000000493 13271 1727203841.33599: done sending task result for task 028d2410-947f-2a40-12ba-000000000493 13271 1727203841.33602: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13271 1727203841.33684: no more pending results, returning what we have 13271 1727203841.33688: results queue empty 13271 1727203841.33689: checking for any_errors_fatal 13271 1727203841.33697: done checking for any_errors_fatal 13271 1727203841.33698: checking for max_fail_percentage 13271 1727203841.33700: done checking for max_fail_percentage 13271 1727203841.33701: checking to see if all hosts have failed and the running result is not ok 13271 1727203841.33702: done checking to see if all hosts have failed 13271 1727203841.33703: getting the remaining hosts for this loop 13271 1727203841.33705: done getting the remaining hosts for this loop 13271 
1727203841.33709: getting the next task for host managed-node1 13271 1727203841.33718: done getting next task for host managed-node1 13271 1727203841.33722: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13271 1727203841.33727: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13271 1727203841.33747: getting variables 13271 1727203841.33749: in VariableManager get_vars() 13271 1727203841.33799: Calling all_inventory to load vars for managed-node1 13271 1727203841.33803: Calling groups_inventory to load vars for managed-node1 13271 1727203841.33805: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203841.33817: Calling all_plugins_play to load vars for managed-node1 13271 1727203841.33821: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203841.33824: Calling groups_plugins_play to load vars for managed-node1 13271 1727203841.35516: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203841.37111: done with get_vars() 13271 1727203841.37135: done getting variables 13271 1727203841.37197: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:50:41 -0400 (0:00:00.061) 0:00:25.015 ***** 13271 1727203841.37235: entering _queue_task() for managed-node1/set_fact 13271 1727203841.37571: worker is 1 (out of 1 available) 13271 1727203841.37585: exiting _queue_task() for managed-node1/set_fact 13271 1727203841.37598: done queuing things up, now waiting for results queue to drain 13271 1727203841.37600: waiting for pending results... 
13271 1727203841.37997: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 13271 1727203841.38057: in run() - task 028d2410-947f-2a40-12ba-000000000494 13271 1727203841.38081: variable 'ansible_search_path' from source: unknown 13271 1727203841.38092: variable 'ansible_search_path' from source: unknown 13271 1727203841.38129: calling self._execute() 13271 1727203841.38231: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203841.38244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203841.38264: variable 'omit' from source: magic vars 13271 1727203841.38655: variable 'ansible_distribution_major_version' from source: facts 13271 1727203841.38678: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203841.38860: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13271 1727203841.39151: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13271 1727203841.39209: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13271 1727203841.39248: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13271 1727203841.39295: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13271 1727203841.39384: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13271 1727203841.39420: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13271 1727203841.39451: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203841.39490: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13271 1727203841.39585: variable '__network_is_ostree' from source: set_fact 13271 1727203841.39598: Evaluated conditional (not __network_is_ostree is defined): False 13271 1727203841.39605: when evaluation is False, skipping this task 13271 1727203841.39617: _execute() done 13271 1727203841.39625: dumping result to json 13271 1727203841.39633: done dumping result, returning 13271 1727203841.39645: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [028d2410-947f-2a40-12ba-000000000494] 13271 1727203841.39724: sending task result for task 028d2410-947f-2a40-12ba-000000000494 13271 1727203841.39795: done sending task result for task 028d2410-947f-2a40-12ba-000000000494 13271 1727203841.39798: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 13271 1727203841.39882: no more pending results, returning what we have 13271 1727203841.39886: results queue empty 13271 1727203841.39887: checking for any_errors_fatal 13271 1727203841.39893: done checking for any_errors_fatal 13271 1727203841.39894: checking for max_fail_percentage 13271 1727203841.39896: done checking for max_fail_percentage 13271 1727203841.39897: checking to see if all hosts have failed and the running result is not ok 13271 1727203841.39898: done checking to see if all hosts have failed 13271 1727203841.39899: getting the remaining hosts for this loop 13271 1727203841.39900: done getting the remaining hosts for this loop 
13271 1727203841.39904: getting the next task for host managed-node1 13271 1727203841.39914: done getting next task for host managed-node1 13271 1727203841.39919: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 13271 1727203841.39924: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13271 1727203841.39943: getting variables 13271 1727203841.39946: in VariableManager get_vars() 13271 1727203841.39992: Calling all_inventory to load vars for managed-node1 13271 1727203841.39995: Calling groups_inventory to load vars for managed-node1 13271 1727203841.39998: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203841.40010: Calling all_plugins_play to load vars for managed-node1 13271 1727203841.40013: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203841.40016: Calling groups_plugins_play to load vars for managed-node1 13271 1727203841.41554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203841.43183: done with get_vars() 13271 1727203841.43206: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:50:41 -0400 (0:00:00.060) 0:00:25.076 ***** 13271 1727203841.43303: entering _queue_task() for managed-node1/service_facts 13271 1727203841.43612: worker is 1 (out of 1 available) 13271 1727203841.43622: exiting _queue_task() for managed-node1/service_facts 13271 1727203841.43634: done queuing things up, now waiting for results queue to drain 13271 1727203841.43636: waiting for pending results... 
13271 1727203841.44003: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 13271 1727203841.44105: in run() - task 028d2410-947f-2a40-12ba-000000000496 13271 1727203841.44128: variable 'ansible_search_path' from source: unknown 13271 1727203841.44137: variable 'ansible_search_path' from source: unknown 13271 1727203841.44182: calling self._execute() 13271 1727203841.44269: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203841.44282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203841.44295: variable 'omit' from source: magic vars 13271 1727203841.44686: variable 'ansible_distribution_major_version' from source: facts 13271 1727203841.44880: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203841.44883: variable 'omit' from source: magic vars 13271 1727203841.44885: variable 'omit' from source: magic vars 13271 1727203841.44887: variable 'omit' from source: magic vars 13271 1727203841.44889: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203841.44928: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203841.44955: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203841.44983: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203841.45005: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203841.45043: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203841.45053: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203841.45066: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 13271 1727203841.45178: Set connection var ansible_connection to ssh 13271 1727203841.45192: Set connection var ansible_shell_type to sh 13271 1727203841.45207: Set connection var ansible_timeout to 10 13271 1727203841.45222: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203841.45232: Set connection var ansible_pipelining to False 13271 1727203841.45242: Set connection var ansible_shell_executable to /bin/sh 13271 1727203841.45274: variable 'ansible_shell_executable' from source: unknown 13271 1727203841.45285: variable 'ansible_connection' from source: unknown 13271 1727203841.45293: variable 'ansible_module_compression' from source: unknown 13271 1727203841.45301: variable 'ansible_shell_type' from source: unknown 13271 1727203841.45308: variable 'ansible_shell_executable' from source: unknown 13271 1727203841.45314: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203841.45322: variable 'ansible_pipelining' from source: unknown 13271 1727203841.45335: variable 'ansible_timeout' from source: unknown 13271 1727203841.45440: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203841.45567: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13271 1727203841.45586: variable 'omit' from source: magic vars 13271 1727203841.45596: starting attempt loop 13271 1727203841.45604: running the handler 13271 1727203841.45622: _low_level_execute_command(): starting 13271 1727203841.45635: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203841.46389: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203841.46426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203841.46445: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 1727203841.46538: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203841.46554: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203841.46582: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203841.46714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203841.48500: stdout chunk (state=3): >>>/root <<< 13271 1727203841.48652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203841.48656: stdout chunk (state=3): >>><<< 13271 1727203841.48659: stderr chunk (state=3): >>><<< 13271 1727203841.48682: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203841.48704: _low_level_execute_command(): starting 13271 1727203841.48793: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203841.4869082-14938-30106635594088 `" && echo ansible-tmp-1727203841.4869082-14938-30106635594088="` echo /root/.ansible/tmp/ansible-tmp-1727203841.4869082-14938-30106635594088 `" ) && sleep 0' 13271 1727203841.49343: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203841.49358: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203841.49387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203841.49442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203841.49529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203841.49535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203841.49560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203841.49688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203841.51798: stdout chunk (state=3): >>>ansible-tmp-1727203841.4869082-14938-30106635594088=/root/.ansible/tmp/ansible-tmp-1727203841.4869082-14938-30106635594088 <<< 13271 1727203841.52084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203841.52087: stdout chunk (state=3): >>><<< 13271 1727203841.52090: stderr chunk (state=3): >>><<< 13271 1727203841.52092: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203841.4869082-14938-30106635594088=/root/.ansible/tmp/ansible-tmp-1727203841.4869082-14938-30106635594088 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203841.52094: variable 'ansible_module_compression' from source: unknown 13271 1727203841.52096: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 13271 1727203841.52133: variable 'ansible_facts' from source: unknown 13271 1727203841.52216: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203841.4869082-14938-30106635594088/AnsiballZ_service_facts.py 13271 1727203841.52429: Sending initial data 13271 1727203841.52432: Sent initial data (161 bytes) 13271 1727203841.52965: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203841.53071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203841.53077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203841.53125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203841.53202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203841.54983: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203841.55045: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203841.55148: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmplq2m8y7t /root/.ansible/tmp/ansible-tmp-1727203841.4869082-14938-30106635594088/AnsiballZ_service_facts.py <<< 13271 1727203841.55158: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203841.4869082-14938-30106635594088/AnsiballZ_service_facts.py" <<< 13271 1727203841.55213: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmplq2m8y7t" to remote "/root/.ansible/tmp/ansible-tmp-1727203841.4869082-14938-30106635594088/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203841.4869082-14938-30106635594088/AnsiballZ_service_facts.py" <<< 13271 1727203841.56142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203841.56157: stdout chunk (state=3): >>><<< 13271 1727203841.56181: stderr chunk (state=3): >>><<< 13271 1727203841.56284: done transferring module to remote 13271 1727203841.56369: _low_level_execute_command(): starting 13271 1727203841.56372: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203841.4869082-14938-30106635594088/ /root/.ansible/tmp/ansible-tmp-1727203841.4869082-14938-30106635594088/AnsiballZ_service_facts.py && sleep 0' 13271 1727203841.56891: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203841.56899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203841.56909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203841.56922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203841.56934: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 
originally 10.31.14.47 <<< 13271 1727203841.56941: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203841.56949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203841.56963: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 1727203841.56973: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 13271 1727203841.56981: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13271 1727203841.57089: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203841.57097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203841.57100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203841.57298: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203841.59483: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203841.59487: stderr chunk (state=3): >>><<< 13271 1727203841.59490: stdout chunk (state=3): >>><<< 13271 1727203841.59492: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203841.59495: _low_level_execute_command(): starting 13271 1727203841.59498: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203841.4869082-14938-30106635594088/AnsiballZ_service_facts.py && sleep 0' 13271 1727203841.59987: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203841.59996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203841.60090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203841.60111: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203841.60124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203841.60141: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203841.60257: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203843.35793: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": 
"display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": 
"systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": 
"systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, 
"sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": 
"systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": 
{"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 13271 1727203843.37364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 13271 1727203843.37368: stdout chunk (state=3): >>><<< 13271 1727203843.37381: stderr chunk (state=3): >>><<< 13271 1727203843.37425: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": 
{"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": 
"sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": 
{"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": 
"dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": 
"lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": 
"selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": 
"systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 13271 1727203843.40116: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203841.4869082-14938-30106635594088/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203843.40184: _low_level_execute_command(): starting 13271 1727203843.40188: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203841.4869082-14938-30106635594088/ > /dev/null 2>&1 && sleep 0' 13271 1727203843.41426: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203843.41441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13271 1727203843.41695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203843.41712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203843.41809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203843.43803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203843.43916: stderr chunk (state=3): >>><<< 13271 1727203843.43919: stdout chunk (state=3): >>><<< 13271 1727203843.43935: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203843.43941: handler run complete 13271 1727203843.44287: variable 'ansible_facts' from source: unknown 13271 1727203843.44853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203843.45855: variable 'ansible_facts' from source: unknown 13271 1727203843.46034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203843.46466: attempt loop complete, returning result 13271 1727203843.46470: _execute() done 13271 1727203843.46472: dumping result to json 13271 1727203843.46655: done dumping result, returning 13271 1727203843.46666: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [028d2410-947f-2a40-12ba-000000000496] 13271 1727203843.46669: sending task result for task 028d2410-947f-2a40-12ba-000000000496 13271 1727203843.48614: done sending task result for task 028d2410-947f-2a40-12ba-000000000496 13271 1727203843.48618: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13271 1727203843.48737: no more pending results, returning what we have 13271 1727203843.48740: results queue empty 13271 1727203843.48741: checking for any_errors_fatal 13271 1727203843.48745: done checking for any_errors_fatal 13271 1727203843.48746: checking for max_fail_percentage 13271 1727203843.48747: done checking for max_fail_percentage 13271 1727203843.48748: checking to see if all hosts have failed and the running result is not ok 13271 1727203843.48749: done checking to see if all hosts have failed 13271 1727203843.48750: getting the remaining hosts for this loop 13271 1727203843.48751: done getting the remaining hosts for this loop 13271 
1727203843.48754: getting the next task for host managed-node1 13271 1727203843.48760: done getting next task for host managed-node1 13271 1727203843.48766: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 13271 1727203843.48772: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13271 1727203843.48785: getting variables 13271 1727203843.48786: in VariableManager get_vars() 13271 1727203843.48816: Calling all_inventory to load vars for managed-node1 13271 1727203843.48819: Calling groups_inventory to load vars for managed-node1 13271 1727203843.48821: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203843.48831: Calling all_plugins_play to load vars for managed-node1 13271 1727203843.48834: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203843.48837: Calling groups_plugins_play to load vars for managed-node1 13271 1727203843.51877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203843.55436: done with get_vars() 13271 1727203843.55464: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:50:43 -0400 (0:00:02.123) 0:00:27.200 ***** 13271 1727203843.55692: entering _queue_task() for managed-node1/package_facts 13271 1727203843.56603: worker is 1 (out of 1 available) 13271 1727203843.56615: exiting _queue_task() for managed-node1/package_facts 13271 1727203843.56628: done queuing things up, now waiting for results queue to drain 13271 1727203843.56630: waiting for pending results... 
13271 1727203843.57192: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 13271 1727203843.57581: in run() - task 028d2410-947f-2a40-12ba-000000000497 13271 1727203843.57586: variable 'ansible_search_path' from source: unknown 13271 1727203843.57589: variable 'ansible_search_path' from source: unknown 13271 1727203843.57592: calling self._execute() 13271 1727203843.57981: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203843.57984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203843.57987: variable 'omit' from source: magic vars 13271 1727203843.58357: variable 'ansible_distribution_major_version' from source: facts 13271 1727203843.58781: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203843.58785: variable 'omit' from source: magic vars 13271 1727203843.58788: variable 'omit' from source: magic vars 13271 1727203843.58790: variable 'omit' from source: magic vars 13271 1727203843.58792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203843.58989: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203843.59015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203843.59037: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203843.59054: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203843.59092: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203843.59102: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203843.59110: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node1' 13271 1727203843.59414: Set connection var ansible_connection to ssh 13271 1727203843.59427: Set connection var ansible_shell_type to sh 13271 1727203843.59440: Set connection var ansible_timeout to 10 13271 1727203843.59449: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203843.59460: Set connection var ansible_pipelining to False 13271 1727203843.59473: Set connection var ansible_shell_executable to /bin/sh 13271 1727203843.59506: variable 'ansible_shell_executable' from source: unknown 13271 1727203843.59526: variable 'ansible_connection' from source: unknown 13271 1727203843.59533: variable 'ansible_module_compression' from source: unknown 13271 1727203843.59539: variable 'ansible_shell_type' from source: unknown 13271 1727203843.59544: variable 'ansible_shell_executable' from source: unknown 13271 1727203843.59550: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203843.59557: variable 'ansible_pipelining' from source: unknown 13271 1727203843.59565: variable 'ansible_timeout' from source: unknown 13271 1727203843.59573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203843.59865: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13271 1727203843.60094: variable 'omit' from source: magic vars 13271 1727203843.60104: starting attempt loop 13271 1727203843.60111: running the handler 13271 1727203843.60127: _low_level_execute_command(): starting 13271 1727203843.60138: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203843.61095: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203843.61110: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 13271 1727203843.61191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203843.61229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203843.61248: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203843.61273: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203843.61394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203843.63242: stdout chunk (state=3): >>>/root <<< 13271 1727203843.63478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203843.63482: stdout chunk (state=3): >>><<< 13271 1727203843.63484: stderr chunk (state=3): >>><<< 13271 1727203843.63529: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203843.63569: _low_level_execute_command(): starting 13271 1727203843.63585: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203843.6354458-15092-187199344481109 `" && echo ansible-tmp-1727203843.6354458-15092-187199344481109="` echo /root/.ansible/tmp/ansible-tmp-1727203843.6354458-15092-187199344481109 `" ) && sleep 0' 13271 1727203843.64294: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203843.64309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203843.64325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203843.64433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203843.64458: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203843.64492: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203843.64507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203843.64618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203843.66742: stdout chunk (state=3): >>>ansible-tmp-1727203843.6354458-15092-187199344481109=/root/.ansible/tmp/ansible-tmp-1727203843.6354458-15092-187199344481109 <<< 13271 1727203843.66908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203843.66911: stdout chunk (state=3): >>><<< 13271 1727203843.66914: stderr chunk (state=3): >>><<< 13271 1727203843.66931: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203843.6354458-15092-187199344481109=/root/.ansible/tmp/ansible-tmp-1727203843.6354458-15092-187199344481109 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203843.67083: variable 'ansible_module_compression' from source: unknown 13271 1727203843.67087: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 13271 1727203843.67116: variable 'ansible_facts' from source: unknown 13271 1727203843.67310: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203843.6354458-15092-187199344481109/AnsiballZ_package_facts.py 13271 1727203843.67497: Sending initial data 13271 1727203843.67506: Sent initial data (162 bytes) 13271 1727203843.68241: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203843.68293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203843.68307: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203843.68317: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203843.68326: stderr chunk (state=3): >>>debug2: match found <<< 13271 1727203843.68401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203843.68422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203843.68445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203843.68470: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203843.68633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203843.70366: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13271 1727203843.70410: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 13271 1727203843.70414: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203843.70491: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203843.70625: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpgowwkct8 /root/.ansible/tmp/ansible-tmp-1727203843.6354458-15092-187199344481109/AnsiballZ_package_facts.py <<< 13271 1727203843.70629: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203843.6354458-15092-187199344481109/AnsiballZ_package_facts.py" <<< 13271 1727203843.70686: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpgowwkct8" to remote "/root/.ansible/tmp/ansible-tmp-1727203843.6354458-15092-187199344481109/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203843.6354458-15092-187199344481109/AnsiballZ_package_facts.py" <<< 13271 1727203843.73391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203843.73394: stdout chunk (state=3): >>><<< 13271 1727203843.73400: stderr chunk (state=3): >>><<< 13271 1727203843.73422: done transferring module to remote 13271 1727203843.73433: _low_level_execute_command(): starting 13271 1727203843.73441: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203843.6354458-15092-187199344481109/ /root/.ansible/tmp/ansible-tmp-1727203843.6354458-15092-187199344481109/AnsiballZ_package_facts.py && sleep 0' 13271 1727203843.74508: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203843.74512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203843.74515: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13271 1727203843.74517: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203843.74520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203843.74744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203843.74814: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203843.74846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203843.74990: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203843.77082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203843.77087: stdout chunk (state=3): >>><<< 13271 1727203843.77089: stderr chunk (state=3): >>><<< 13271 1727203843.77105: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203843.77449: _low_level_execute_command(): starting 13271 1727203843.77453: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203843.6354458-15092-187199344481109/AnsiballZ_package_facts.py && sleep 0' 13271 1727203843.78582: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203843.78794: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 <<< 13271 1727203843.78926: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203844.25842: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": 
"linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 13271 1727203844.25999: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": 
"centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", 
"version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": 
"libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": 
"x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": 
"libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", 
"release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": 
"5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": 
"python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": 
"python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": 
"1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", 
"release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": 
[{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, 
"arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": 
"python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], 
"cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 13271 1727203844.28023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203844.28056: stderr chunk (state=3): >>><<< 13271 1727203844.28059: stdout chunk (state=3): >>><<< 13271 1727203844.28093: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": 
"iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": 
[{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": 
"1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": 
"rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", 
"version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": 
[{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "25.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": 
"8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", 
"release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "7.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": 
"5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", 
"version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": 
"python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": 
"python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", 
"version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.14.47 closed. 13271 1727203844.29489: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203843.6354458-15092-187199344481109/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203844.29509: _low_level_execute_command(): starting 13271 1727203844.29524: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203843.6354458-15092-187199344481109/ > /dev/null 2>&1 && sleep 0' 13271 1727203844.29969: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203844.29972: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 1727203844.29974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13271 1727203844.29979: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203844.29981: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203844.30036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203844.30039: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203844.30045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203844.30125: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203844.32136: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203844.32140: stderr chunk (state=3): >>><<< 13271 1727203844.32143: stdout chunk (state=3): >>><<< 13271 1727203844.32145: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203844.32148: handler run complete 13271 1727203844.32677: variable 'ansible_facts' from source: unknown 13271 1727203844.33203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203844.34878: variable 'ansible_facts' from source: unknown 13271 1727203844.35270: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203844.36019: attempt loop complete, returning result 13271 1727203844.36030: _execute() done 13271 1727203844.36046: dumping result to json 13271 1727203844.36383: done dumping result, returning 13271 1727203844.36386: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [028d2410-947f-2a40-12ba-000000000497] 13271 1727203844.36388: sending task result for task 028d2410-947f-2a40-12ba-000000000497 13271 1727203844.38640: done sending task result for task 028d2410-947f-2a40-12ba-000000000497 13271 1727203844.38643: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 13271 1727203844.38795: no more pending results, returning what we have 13271 1727203844.38797: results queue empty 13271 1727203844.38798: checking for any_errors_fatal 13271 1727203844.38802: done checking for any_errors_fatal 13271 1727203844.38803: checking for max_fail_percentage 13271 1727203844.38805: done checking for max_fail_percentage 13271 1727203844.38805: checking to see if all hosts have failed and the running result is not ok 13271 1727203844.38806: done checking to see if all hosts have failed 13271 1727203844.38807: getting the remaining hosts for this loop 13271 1727203844.38808: done getting the remaining hosts for this loop 13271 
1727203844.38811: getting the next task for host managed-node1 13271 1727203844.38818: done getting next task for host managed-node1 13271 1727203844.38821: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 13271 1727203844.38825: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13271 1727203844.38835: getting variables 13271 1727203844.38837: in VariableManager get_vars() 13271 1727203844.39006: Calling all_inventory to load vars for managed-node1 13271 1727203844.39009: Calling groups_inventory to load vars for managed-node1 13271 1727203844.39011: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203844.39021: Calling all_plugins_play to load vars for managed-node1 13271 1727203844.39023: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203844.39026: Calling groups_plugins_play to load vars for managed-node1 13271 1727203844.40480: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203844.42191: done with get_vars() 13271 1727203844.42228: done getting variables 13271 1727203844.42340: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:50:44 -0400 (0:00:00.867) 0:00:28.067 ***** 13271 1727203844.42403: entering _queue_task() for managed-node1/debug 13271 1727203844.42963: worker is 1 (out of 1 available) 13271 1727203844.43118: exiting _queue_task() for managed-node1/debug 13271 1727203844.43129: done queuing things up, now waiting for results queue to drain 13271 1727203844.43130: waiting for pending results... 
13271 1727203844.43352: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider
13271 1727203844.43451: in run() - task 028d2410-947f-2a40-12ba-00000000007d
13271 1727203844.43554: variable 'ansible_search_path' from source: unknown
13271 1727203844.43558: variable 'ansible_search_path' from source: unknown
13271 1727203844.43561: calling self._execute()
13271 1727203844.43625: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203844.43638: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203844.43659: variable 'omit' from source: magic vars
13271 1727203844.44048: variable 'ansible_distribution_major_version' from source: facts
13271 1727203844.44064: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203844.44075: variable 'omit' from source: magic vars
13271 1727203844.44150: variable 'omit' from source: magic vars
13271 1727203844.44320: variable 'network_provider' from source: set_fact
13271 1727203844.44325: variable 'omit' from source: magic vars
13271 1727203844.44337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
13271 1727203844.44377: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
13271 1727203844.44403: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
13271 1727203844.44431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13271 1727203844.44581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
13271 1727203844.44584: variable 'inventory_hostname' from source: host vars for 'managed-node1'
13271 1727203844.44586: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203844.44589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203844.44600: Set connection var ansible_connection to ssh
13271 1727203844.44613: Set connection var ansible_shell_type to sh
13271 1727203844.44625: Set connection var ansible_timeout to 10
13271 1727203844.44634: Set connection var ansible_module_compression to ZIP_DEFLATED
13271 1727203844.44643: Set connection var ansible_pipelining to False
13271 1727203844.44651: Set connection var ansible_shell_executable to /bin/sh
13271 1727203844.44682: variable 'ansible_shell_executable' from source: unknown
13271 1727203844.44690: variable 'ansible_connection' from source: unknown
13271 1727203844.44696: variable 'ansible_module_compression' from source: unknown
13271 1727203844.44713: variable 'ansible_shell_type' from source: unknown
13271 1727203844.44719: variable 'ansible_shell_executable' from source: unknown
13271 1727203844.44727: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203844.44736: variable 'ansible_pipelining' from source: unknown
13271 1727203844.44742: variable 'ansible_timeout' from source: unknown
13271 1727203844.44818: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203844.44894: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
13271 1727203844.44910: variable 'omit' from source: magic vars
13271 1727203844.44928: starting attempt loop
13271 1727203844.44934: running the handler
13271 1727203844.44983: handler run complete
13271 1727203844.45000: attempt loop complete, returning result
13271 1727203844.45006: _execute() done
13271 1727203844.45013: dumping result to json
13271 1727203844.45020: done dumping result, returning
13271 1727203844.45039: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [028d2410-947f-2a40-12ba-00000000007d]
13271 1727203844.45047: sending task result for task 028d2410-947f-2a40-12ba-00000000007d
13271 1727203844.45382: done sending task result for task 028d2410-947f-2a40-12ba-00000000007d
13271 1727203844.45386: WORKER PROCESS EXITING
ok: [managed-node1] => {}

MSG:

Using network provider: nm
13271 1727203844.45438: no more pending results, returning what we have
13271 1727203844.45440: results queue empty
13271 1727203844.45441: checking for any_errors_fatal
13271 1727203844.45447: done checking for any_errors_fatal
13271 1727203844.45448: checking for max_fail_percentage
13271 1727203844.45450: done checking for max_fail_percentage
13271 1727203844.45450: checking to see if all hosts have failed and the running result is not ok
13271 1727203844.45451: done checking to see if all hosts have failed
13271 1727203844.45452: getting the remaining hosts for this loop
13271 1727203844.45454: done getting the remaining hosts for this loop
13271 1727203844.45458: getting the next task for host managed-node1
13271 1727203844.45463: done getting next task for host managed-node1
13271 1727203844.45468: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
13271 1727203844.45472: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13271 1727203844.45484: getting variables
13271 1727203844.45486: in VariableManager get_vars()
13271 1727203844.45521: Calling all_inventory to load vars for managed-node1
13271 1727203844.45524: Calling groups_inventory to load vars for managed-node1
13271 1727203844.45526: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203844.45534: Calling all_plugins_play to load vars for managed-node1
13271 1727203844.45537: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203844.45540: Calling groups_plugins_play to load vars for managed-node1
13271 1727203844.52057: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203844.53917: done with get_vars()
13271 1727203844.53951: done getting variables
13271 1727203844.54015: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11
Tuesday 24 September 2024 14:50:44 -0400 (0:00:00.116) 0:00:28.183 *****
13271 1727203844.54057: entering _queue_task() for managed-node1/fail
13271 1727203844.54614: worker is 1 (out of 1 available)
13271 1727203844.54629: exiting _queue_task() for managed-node1/fail
13271 1727203844.54639: done queuing things up, now waiting for results queue to drain
13271 1727203844.54641: waiting for pending results...
13271 1727203844.55066: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
13271 1727203844.55549: in run() - task 028d2410-947f-2a40-12ba-00000000007e
13271 1727203844.55628: variable 'ansible_search_path' from source: unknown
13271 1727203844.55632: variable 'ansible_search_path' from source: unknown
13271 1727203844.55636: calling self._execute()
13271 1727203844.56349: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203844.56382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203844.56388: variable 'omit' from source: magic vars
13271 1727203844.56990: variable 'ansible_distribution_major_version' from source: facts
13271 1727203844.57102: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203844.57317: variable 'network_state' from source: role '' defaults
13271 1727203844.57332: Evaluated conditional (network_state != {}): False
13271 1727203844.57429: when evaluation is False, skipping this task
13271 1727203844.57438: _execute() done
13271 1727203844.57445: dumping result to json
13271 1727203844.57452: done dumping result, returning
13271 1727203844.57467: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [028d2410-947f-2a40-12ba-00000000007e]
13271 1727203844.57480: sending task result for task 028d2410-947f-2a40-12ba-00000000007e
13271 1727203844.57682: done sending task result for task 028d2410-947f-2a40-12ba-00000000007e
13271 1727203844.57686: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13271 1727203844.57737: no more pending results, returning what we have
13271 1727203844.57742: results queue empty
13271 1727203844.57743: checking for any_errors_fatal
13271 1727203844.57749: done checking for any_errors_fatal
13271 1727203844.57750: checking for max_fail_percentage
13271 1727203844.57752: done checking for max_fail_percentage
13271 1727203844.57753: checking to see if all hosts have failed and the running result is not ok
13271 1727203844.57754: done checking to see if all hosts have failed
13271 1727203844.57755: getting the remaining hosts for this loop
13271 1727203844.57756: done getting the remaining hosts for this loop
13271 1727203844.57760: getting the next task for host managed-node1
13271 1727203844.57769: done getting next task for host managed-node1
13271 1727203844.57774: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
13271 1727203844.57779: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13271 1727203844.57802: getting variables
13271 1727203844.57804: in VariableManager get_vars()
13271 1727203844.57848: Calling all_inventory to load vars for managed-node1
13271 1727203844.57851: Calling groups_inventory to load vars for managed-node1
13271 1727203844.57854: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203844.57870: Calling all_plugins_play to load vars for managed-node1
13271 1727203844.57873: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203844.58082: Calling groups_plugins_play to load vars for managed-node1
13271 1727203844.59667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203844.61957: done with get_vars()
13271 1727203844.62191: done getting variables
13271 1727203844.62417: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18
Tuesday 24 September 2024 14:50:44 -0400 (0:00:00.083) 0:00:28.267 *****
13271 1727203844.62454: entering _queue_task() for managed-node1/fail
13271 1727203844.63623: worker is 1 (out of 1 available)
13271 1727203844.63751: exiting _queue_task() for managed-node1/fail
13271 1727203844.63765: done queuing things up, now waiting for results queue to drain
13271 1727203844.63767: waiting for pending results...
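Both "Abort applying the network state configuration" tasks in this trace are skipped for the same reason: `network_state` still holds its role default of `{}` (the log notes `variable 'network_state' from source: role '' defaults`), so the `network_state != {}` guard evaluates to False. A minimal sketch of that skip decision in plain Python, standing in for the Jinja2 conditional (the helper name is hypothetical, not Ansible code):

```python
# Sketch of the when-guard seen above: a task guarded by
# "network_state != {}" only runs when the caller supplied a
# non-empty network_state dictionary. Hypothetical helper.
def should_apply_network_state(task_vars):
    # Role default: network_state is an empty dict.
    network_state = task_vars.get("network_state", {})
    return network_state != {}

# With only the role default, the guard is False -> task is skipped.
print(should_apply_network_state({}))  # False
# A non-empty state (e.g. declaring an interface) would enable it.
print(should_apply_network_state(
    {"network_state": {"interfaces": [{"name": "eth0"}]}}))  # True
```

This matches the `"false_condition": "network_state != {}"` field in the two skipping results: Ansible records the exact clause that evaluated False.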
13271 1727203844.64120: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8
13271 1727203844.64582: in run() - task 028d2410-947f-2a40-12ba-00000000007f
13271 1727203844.64586: variable 'ansible_search_path' from source: unknown
13271 1727203844.64589: variable 'ansible_search_path' from source: unknown
13271 1727203844.64591: calling self._execute()
13271 1727203844.64654: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203844.64725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203844.64739: variable 'omit' from source: magic vars
13271 1727203844.65466: variable 'ansible_distribution_major_version' from source: facts
13271 1727203844.65594: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203844.65814: variable 'network_state' from source: role '' defaults
13271 1727203844.65834: Evaluated conditional (network_state != {}): False
13271 1727203844.65866: when evaluation is False, skipping this task
13271 1727203844.65874: _execute() done
13271 1727203844.65912: dumping result to json
13271 1727203844.65920: done dumping result, returning
13271 1727203844.65932: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [028d2410-947f-2a40-12ba-00000000007f]
13271 1727203844.65944: sending task result for task 028d2410-947f-2a40-12ba-00000000007f
13271 1727203844.66194: done sending task result for task 028d2410-947f-2a40-12ba-00000000007f
13271 1727203844.66198: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13271 1727203844.66250: no more pending results, returning what we have
13271 1727203844.66254: results queue empty
13271 1727203844.66255: checking for any_errors_fatal
13271 1727203844.66266: done checking for any_errors_fatal
13271 1727203844.66267: checking for max_fail_percentage
13271 1727203844.66269: done checking for max_fail_percentage
13271 1727203844.66270: checking to see if all hosts have failed and the running result is not ok
13271 1727203844.66271: done checking to see if all hosts have failed
13271 1727203844.66272: getting the remaining hosts for this loop
13271 1727203844.66273: done getting the remaining hosts for this loop
13271 1727203844.66279: getting the next task for host managed-node1
13271 1727203844.66287: done getting next task for host managed-node1
13271 1727203844.66290: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
13271 1727203844.66294: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13271 1727203844.66313: getting variables
13271 1727203844.66315: in VariableManager get_vars()
13271 1727203844.66357: Calling all_inventory to load vars for managed-node1
13271 1727203844.66359: Calling groups_inventory to load vars for managed-node1
13271 1727203844.66364: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203844.66579: Calling all_plugins_play to load vars for managed-node1
13271 1727203844.66584: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203844.66588: Calling groups_plugins_play to load vars for managed-node1
13271 1727203844.70116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203844.74588: done with get_vars()
13271 1727203844.74619: done getting variables
13271 1727203844.74682: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25
Tuesday 24 September 2024 14:50:44 -0400 (0:00:00.123) 0:00:28.391 *****
13271 1727203844.74818: entering _queue_task() for managed-node1/fail
13271 1727203844.75577: worker is 1 (out of 1 available)
13271 1727203844.75588: exiting _queue_task() for managed-node1/fail
13271 1727203844.75598: done queuing things up, now waiting for results queue to drain
13271 1727203844.75600: waiting for pending results...
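Note the two different comparison styles in these conditionals: `ansible_distribution_major_version != '6'` compares strings directly, while the EL10 gate on the next task casts with `| int > 9`. The cast matters because the gathered fact is a string, and string ordering is lexicographic. A small sketch (the example value "10" is an assumption for illustration, not a fact from this run):

```python
# ansible_distribution_major_version is gathered as a string fact.
major = "10"

# String inequality is fine for the "!= '6'" guard.
print(major != "6")    # True

# Ordering comparisons need the int cast, as in "| int > 9":
# lexicographically "10" < "9" (it compares "1" against "9"),
# so the string comparison would give the wrong answer.
print(major > "9")     # False
print(int(major) > 9)  # True
```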
13271 1727203844.76395: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later
13271 1727203844.76405: in run() - task 028d2410-947f-2a40-12ba-000000000080
13271 1727203844.76409: variable 'ansible_search_path' from source: unknown
13271 1727203844.76412: variable 'ansible_search_path' from source: unknown
13271 1727203844.76415: calling self._execute()
13271 1727203844.76735: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203844.76743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203844.76756: variable 'omit' from source: magic vars
13271 1727203844.77682: variable 'ansible_distribution_major_version' from source: facts
13271 1727203844.77686: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203844.77934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13271 1727203844.83998: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13271 1727203844.84154: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13271 1727203844.84229: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13271 1727203844.84412: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13271 1727203844.84442: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13271 1727203844.84549: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203844.84780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203844.84898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203844.84944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203844.85024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203844.85177: variable 'ansible_distribution_major_version' from source: facts
13271 1727203844.85250: Evaluated conditional (ansible_distribution_major_version | int > 9): True
13271 1727203844.85516: variable 'ansible_distribution' from source: facts
13271 1727203844.85565: variable '__network_rh_distros' from source: role '' defaults
13271 1727203844.85583: Evaluated conditional (ansible_distribution in __network_rh_distros): True
13271 1727203844.86181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203844.86184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203844.86237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203844.86366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203844.86387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203844.86528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203844.86564: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203844.86746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203844.86856: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203844.87281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203844.87284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203844.87287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203844.87289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203844.87338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203844.87400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203844.88130: variable 'network_connections' from source: task vars
13271 1727203844.88134: variable 'port2_profile' from source: play vars
13271 1727203844.88233: variable 'port2_profile' from source: play vars
13271 1727203844.88589: variable 'port1_profile' from source: play vars
13271 1727203844.88592: variable 'port1_profile' from source: play vars
13271 1727203844.88594: variable 'controller_profile' from source: play vars
13271 1727203844.88880: variable 'controller_profile' from source: play vars
13271 1727203844.88883: variable 'network_state' from source: role '' defaults
13271 1727203844.88886: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13271 1727203844.89143: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13271 1727203844.89480: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13271 1727203844.89483: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13271 1727203844.89486: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13271 1727203844.89541: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13271 1727203844.89572: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13271 1727203844.89606: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203844.89635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13271 1727203844.89671: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False
13271 1727203844.89886: when evaluation is False, skipping this task
13271 1727203844.90081: _execute() done
13271 1727203844.90084: dumping result to json
13271 1727203844.90088: done dumping result, returning
13271 1727203844.90095: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [028d2410-947f-2a40-12ba-000000000080]
13271 1727203844.90098: sending task result for task 028d2410-947f-2a40-12ba-000000000080
13271 1727203844.90170: done sending task result for task 028d2410-947f-2a40-12ba-000000000080
13271 1727203844.90173: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0",
    "skip_reason": "Conditional result was False"
}
13271 1727203844.90248: no more pending results, returning what we have
13271 1727203844.90251: results queue empty
13271 1727203844.90252: checking for any_errors_fatal
13271 1727203844.90259: done checking for any_errors_fatal
13271 1727203844.90260: checking for max_fail_percentage
13271 1727203844.90262: done checking for max_fail_percentage
13271 1727203844.90263: checking to see if all hosts have failed and the running result is not ok
13271 1727203844.90264: done checking to see if all hosts have failed
13271 1727203844.90264: getting the remaining hosts for this loop
13271 1727203844.90266: done getting the remaining hosts for this loop
13271 1727203844.90270: getting the next task for host managed-node1
13271 1727203844.90279: done getting next task for host managed-node1
13271 1727203844.90283: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
13271 1727203844.90287: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13271 1727203844.90305: getting variables
13271 1727203844.90307: in VariableManager get_vars()
13271 1727203844.90348: Calling all_inventory to load vars for managed-node1
13271 1727203844.90351: Calling groups_inventory to load vars for managed-node1
13271 1727203844.90354: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203844.90365: Calling all_plugins_play to load vars for managed-node1
13271 1727203844.90369: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203844.90372: Calling groups_plugins_play to load vars for managed-node1
13271 1727203844.93144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203844.96931: done with get_vars()
13271 1727203844.96956: done getting variables
13271 1727203844.97019: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Tuesday 24 September 2024 14:50:44 -0400 (0:00:00.222) 0:00:28.613 *****
13271 1727203844.97058: entering _queue_task() for managed-node1/dnf
13271 1727203844.97512: worker is 1 (out of 1 available)
13271 1727203844.97524: exiting _queue_task() for managed-node1/dnf
13271 1727203844.97534: done queuing things up, now waiting for results queue to drain
13271 1727203844.97536: waiting for pending results...
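The teaming-abort task above passes its EL10 gates (`| int > 9` and `ansible_distribution in __network_rh_distros` both True) but is still skipped, because neither `network_connections` nor `network_state` defines a profile of type `team`: this play only defines bond controller/port profiles. The conditional chains `selectattr("type", "defined") | selectattr("type", "match", "^team$")`; an equivalent plain-Python sketch (the profile data below is a hypothetical stand-in for the play's bond profiles, not taken from this run):

```python
import re

# Plain-Python equivalent of the skipped task's conditional:
#   network_connections | selectattr("type", "defined")
#     | selectattr("type", "match", "^team$") | list | length > 0
#   or network_state.get("interfaces", []) | (same chain)
def has_team_profiles(network_connections, network_state):
    def teams(items):
        # selectattr("type", "defined"): keep dicts that have a "type" key;
        # selectattr("type", "match", "^team$"): anchored regex match.
        return [i for i in items
                if "type" in i and re.match(r"^team$", i["type"])]
    return bool(teams(network_connections)
                or teams(network_state.get("interfaces", [])))

# Hypothetical stand-ins for the bond profiles in this play:
conns = [{"name": "bond0.0", "type": "ethernet"},
         {"name": "bond0.1", "type": "ethernet"},
         {"name": "bond0", "type": "bond"}]
print(has_team_profiles(conns, {}))  # False -> task skipped
```

Any profile with `"type": "team"` in either structure would flip the result to True and make the role abort, which is the point of the guard on EL10+ hosts where teamd is gone.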
13271 1727203844.97751: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
13271 1727203844.97880: in run() - task 028d2410-947f-2a40-12ba-000000000081
13271 1727203844.97903: variable 'ansible_search_path' from source: unknown
13271 1727203844.97906: variable 'ansible_search_path' from source: unknown
13271 1727203844.97951: calling self._execute()
13271 1727203844.98143: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203844.98150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203844.98153: variable 'omit' from source: magic vars
13271 1727203844.98474: variable 'ansible_distribution_major_version' from source: facts
13271 1727203844.98487: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203844.98714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13271 1727203845.01184: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13271 1727203845.01206: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13271 1727203845.01242: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13271 1727203845.01280: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13271 1727203845.01312: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13271 1727203845.01386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203845.01422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203845.01445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203845.01582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203845.01586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203845.01690: variable 'ansible_distribution' from source: facts
13271 1727203845.01694: variable 'ansible_distribution_major_version' from source: facts
13271 1727203845.01696: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
13271 1727203845.01769: variable '__network_wireless_connections_defined' from source: role '' defaults
13271 1727203845.01942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203845.01946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203845.01983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203845.02020: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203845.02034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203845.02085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203845.02121: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203845.02142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203845.02195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203845.02212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203845.02349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203845.02352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203845.02355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203845.02357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203845.02403: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203845.02983: variable 'network_connections' from source: task vars
13271 1727203845.02986: variable 'port2_profile' from source: play vars
13271 1727203845.02989: variable 'port2_profile' from source: play vars
13271 1727203845.02991: variable 'port1_profile' from source: play vars
13271 1727203845.02993: variable 'port1_profile' from source: play vars
13271 1727203845.02995: variable 'controller_profile' from source: play vars
13271 1727203845.02997: variable 'controller_profile' from source: play vars
13271 1727203845.03064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13271 1727203845.03245: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13271 1727203845.03281: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13271 1727203845.03310: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13271 1727203845.03343: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13271 1727203845.03385: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
(found_in_cache=True, class_only=False) 13271 1727203845.03407: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13271 1727203845.03431: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203845.03467: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13271 1727203845.03515: variable '__network_team_connections_defined' from source: role '' defaults 13271 1727203845.03759: variable 'network_connections' from source: task vars 13271 1727203845.03881: variable 'port2_profile' from source: play vars 13271 1727203845.03938: variable 'port2_profile' from source: play vars 13271 1727203845.03945: variable 'port1_profile' from source: play vars 13271 1727203845.04111: variable 'port1_profile' from source: play vars 13271 1727203845.04120: variable 'controller_profile' from source: play vars 13271 1727203845.04181: variable 'controller_profile' from source: play vars 13271 1727203845.04481: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 13271 1727203845.04484: when evaluation is False, skipping this task 13271 1727203845.04486: _execute() done 13271 1727203845.04488: dumping result to json 13271 1727203845.04489: done dumping result, returning 13271 1727203845.04491: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [028d2410-947f-2a40-12ba-000000000081] 13271 1727203845.04494: sending task result for task 
028d2410-947f-2a40-12ba-000000000081 13271 1727203845.04559: done sending task result for task 028d2410-947f-2a40-12ba-000000000081 13271 1727203845.04565: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13271 1727203845.04633: no more pending results, returning what we have 13271 1727203845.04640: results queue empty 13271 1727203845.04642: checking for any_errors_fatal 13271 1727203845.04651: done checking for any_errors_fatal 13271 1727203845.04652: checking for max_fail_percentage 13271 1727203845.04654: done checking for max_fail_percentage 13271 1727203845.04656: checking to see if all hosts have failed and the running result is not ok 13271 1727203845.04657: done checking to see if all hosts have failed 13271 1727203845.04657: getting the remaining hosts for this loop 13271 1727203845.04659: done getting the remaining hosts for this loop 13271 1727203845.04662: getting the next task for host managed-node1 13271 1727203845.04669: done getting next task for host managed-node1 13271 1727203845.04673: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13271 1727203845.04679: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13271 1727203845.04700: getting variables 13271 1727203845.04702: in VariableManager get_vars() 13271 1727203845.04864: Calling all_inventory to load vars for managed-node1 13271 1727203845.04868: Calling groups_inventory to load vars for managed-node1 13271 1727203845.04870: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203845.04881: Calling all_plugins_play to load vars for managed-node1 13271 1727203845.04885: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203845.04888: Calling groups_plugins_play to load vars for managed-node1 13271 1727203845.06469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203845.09016: done with get_vars() 13271 1727203845.09044: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 13271 1727203845.09132: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:50:45 -0400 (0:00:00.121) 0:00:28.734 ***** 13271 1727203845.09168: entering _queue_task() for managed-node1/yum 13271 1727203845.09550: worker is 1 (out of 1 available) 13271 
1727203845.09678: exiting _queue_task() for managed-node1/yum 13271 1727203845.09689: done queuing things up, now waiting for results queue to drain 13271 1727203845.09691: waiting for pending results... 13271 1727203845.10030: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 13271 1727203845.10102: in run() - task 028d2410-947f-2a40-12ba-000000000082 13271 1727203845.10107: variable 'ansible_search_path' from source: unknown 13271 1727203845.10110: variable 'ansible_search_path' from source: unknown 13271 1727203845.10161: calling self._execute() 13271 1727203845.10275: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203845.10281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203845.10284: variable 'omit' from source: magic vars 13271 1727203845.10714: variable 'ansible_distribution_major_version' from source: facts 13271 1727203845.10820: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203845.10933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13271 1727203845.15665: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13271 1727203845.15928: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13271 1727203845.15932: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13271 1727203845.15935: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13271 1727203845.15937: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13271 1727203845.15956: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203845.15994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203845.16025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203845.16073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203845.16089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203845.16242: variable 'ansible_distribution_major_version' from source: facts 13271 1727203845.16344: Evaluated conditional (ansible_distribution_major_version | int < 8): False 13271 1727203845.16348: when evaluation is False, skipping this task 13271 1727203845.16351: _execute() done 13271 1727203845.16360: dumping result to json 13271 1727203845.16363: done dumping result, returning 13271 1727203845.16369: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [028d2410-947f-2a40-12ba-000000000082] 13271 1727203845.16374: sending task result for task 028d2410-947f-2a40-12ba-000000000082 13271 1727203845.16600: done sending task result for task 028d2410-947f-2a40-12ba-000000000082 13271 1727203845.16603: WORKER PROCESS EXITING skipping: 
[managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 13271 1727203845.16826: no more pending results, returning what we have 13271 1727203845.16829: results queue empty 13271 1727203845.16830: checking for any_errors_fatal 13271 1727203845.16835: done checking for any_errors_fatal 13271 1727203845.16836: checking for max_fail_percentage 13271 1727203845.16838: done checking for max_fail_percentage 13271 1727203845.16839: checking to see if all hosts have failed and the running result is not ok 13271 1727203845.16840: done checking to see if all hosts have failed 13271 1727203845.16841: getting the remaining hosts for this loop 13271 1727203845.16842: done getting the remaining hosts for this loop 13271 1727203845.16845: getting the next task for host managed-node1 13271 1727203845.16851: done getting next task for host managed-node1 13271 1727203845.16856: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13271 1727203845.16863: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13271 1727203845.16884: getting variables 13271 1727203845.16886: in VariableManager get_vars() 13271 1727203845.16924: Calling all_inventory to load vars for managed-node1 13271 1727203845.16927: Calling groups_inventory to load vars for managed-node1 13271 1727203845.16929: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203845.16938: Calling all_plugins_play to load vars for managed-node1 13271 1727203845.16941: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203845.16943: Calling groups_plugins_play to load vars for managed-node1 13271 1727203845.19933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203845.22525: done with get_vars() 13271 1727203845.22551: done getting variables 13271 1727203845.22623: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:50:45 -0400 (0:00:00.134) 0:00:28.869 ***** 13271 1727203845.22672: entering _queue_task() for managed-node1/fail 13271 1727203845.23022: worker is 1 (out of 1 available) 13271 1727203845.23038: exiting _queue_task() for managed-node1/fail 13271 1727203845.23049: done queuing things up, now waiting for results queue to drain 13271 1727203845.23051: waiting for pending results... 
13271 1727203845.23241: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 13271 1727203845.23344: in run() - task 028d2410-947f-2a40-12ba-000000000083 13271 1727203845.23354: variable 'ansible_search_path' from source: unknown 13271 1727203845.23358: variable 'ansible_search_path' from source: unknown 13271 1727203845.23394: calling self._execute() 13271 1727203845.23474: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203845.23480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203845.23490: variable 'omit' from source: magic vars 13271 1727203845.23765: variable 'ansible_distribution_major_version' from source: facts 13271 1727203845.23778: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203845.23857: variable '__network_wireless_connections_defined' from source: role '' defaults 13271 1727203845.23995: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13271 1727203845.26081: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13271 1727203845.26134: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13271 1727203845.26162: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13271 1727203845.26190: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13271 1727203845.26210: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13271 1727203845.26273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 13271 1727203845.26297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203845.26314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203845.26340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203845.26352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203845.26391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203845.26406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203845.26423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203845.26446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203845.26459: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203845.26493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203845.26508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203845.26524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203845.26547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203845.26559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203845.26683: variable 'network_connections' from source: task vars 13271 1727203845.26694: variable 'port2_profile' from source: play vars 13271 1727203845.26749: variable 'port2_profile' from source: play vars 13271 1727203845.26758: variable 'port1_profile' from source: play vars 13271 1727203845.26805: variable 'port1_profile' from source: play vars 13271 1727203845.26812: variable 'controller_profile' from source: play vars 13271 1727203845.26854: variable 'controller_profile' from source: play vars 13271 1727203845.26907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13271 1727203845.27032: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13271 1727203845.27059: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13271 1727203845.27088: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13271 1727203845.27122: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13271 1727203845.27156: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13271 1727203845.27182: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13271 1727203845.27259: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203845.27265: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13271 1727203845.27313: variable '__network_team_connections_defined' from source: role '' defaults 13271 1727203845.27506: variable 'network_connections' from source: task vars 13271 1727203845.27532: variable 'port2_profile' from source: play vars 13271 1727203845.27567: variable 'port2_profile' from source: play vars 13271 1727203845.27611: variable 'port1_profile' from source: play vars 13271 1727203845.27639: variable 'port1_profile' from source: play vars 13271 1727203845.27642: variable 'controller_profile' from source: play vars 13271 1727203845.27735: variable 'controller_profile' from source: play vars 13271 1727203845.27920: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 13271 1727203845.27929: when evaluation is False, skipping this task 13271 1727203845.27931: _execute() done 13271 1727203845.27934: dumping result to json 13271 1727203845.27935: done dumping result, returning 13271 1727203845.27937: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [028d2410-947f-2a40-12ba-000000000083] 13271 1727203845.27939: sending task result for task 028d2410-947f-2a40-12ba-000000000083 13271 1727203845.28019: done sending task result for task 028d2410-947f-2a40-12ba-000000000083 13271 1727203845.28022: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 13271 1727203845.28087: no more pending results, returning what we have 13271 1727203845.28091: results queue empty 13271 1727203845.28092: checking for any_errors_fatal 13271 1727203845.28099: done checking for any_errors_fatal 13271 1727203845.28100: checking for max_fail_percentage 13271 1727203845.28102: done checking for max_fail_percentage 13271 1727203845.28103: checking to see if all hosts have failed and the running result is not ok 13271 1727203845.28104: done checking to see if all hosts have failed 13271 1727203845.28105: getting the remaining hosts for this loop 13271 1727203845.28106: done getting the remaining hosts for this loop 13271 1727203845.28110: getting the next task for host managed-node1 13271 1727203845.28118: done getting next task for host managed-node1 13271 1727203845.28122: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 13271 1727203845.28126: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13271 1727203845.28153: getting variables 13271 1727203845.28155: in VariableManager get_vars() 13271 1727203845.28213: Calling all_inventory to load vars for managed-node1 13271 1727203845.28216: Calling groups_inventory to load vars for managed-node1 13271 1727203845.28218: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203845.28228: Calling all_plugins_play to load vars for managed-node1 13271 1727203845.28231: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203845.28233: Calling groups_plugins_play to load vars for managed-node1 13271 1727203845.29527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203845.30667: done with get_vars() 13271 1727203845.30692: done getting variables 13271 1727203845.30762: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Tuesday 24 September 2024 14:50:45 -0400 (0:00:00.081) 0:00:28.951 *****
13271 1727203845.30803: entering _queue_task() for managed-node1/package
13271 1727203845.31168: worker is 1 (out of 1 available)
13271 1727203845.31381: exiting _queue_task() for managed-node1/package
13271 1727203845.31398: done queuing things up, now waiting for results queue to drain
13271 1727203845.31400: waiting for pending results...
13271 1727203845.31670: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages
13271 1727203845.31677: in run() - task 028d2410-947f-2a40-12ba-000000000084
13271 1727203845.31681: variable 'ansible_search_path' from source: unknown
13271 1727203845.31683: variable 'ansible_search_path' from source: unknown
13271 1727203845.31716: calling self._execute()
13271 1727203845.31847: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203845.31851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203845.31855: variable 'omit' from source: magic vars
13271 1727203845.32208: variable 'ansible_distribution_major_version' from source: facts
13271 1727203845.32222: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203845.32481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13271 1727203845.32717: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13271 1727203845.32763: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13271 1727203845.32805: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13271 1727203845.32973: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13271 1727203845.32999: variable 'network_packages' from source: role '' defaults
13271 1727203845.33097: variable '__network_provider_setup' from source: role '' defaults
13271 1727203845.33113: variable '__network_service_name_default_nm' from source: role '' defaults
13271 1727203845.33183: variable '__network_service_name_default_nm' from source: role '' defaults
13271 1727203845.33194: variable '__network_packages_default_nm' from source: role '' defaults
13271 1727203845.33295: variable '__network_packages_default_nm' from source: role '' defaults
13271 1727203845.33423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13271 1727203845.35387: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13271 1727203845.35430: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13271 1727203845.35457: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13271 1727203845.35486: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13271 1727203845.35505: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13271 1727203845.35581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203845.35624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203845.35722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203845.35726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203845.35728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203845.35730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203845.35752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203845.35848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203845.35862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203845.35882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203845.36050: variable '__network_packages_default_gobject_packages' from source: role '' defaults
13271 1727203845.36161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203845.36192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203845.36215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203845.36251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203845.36269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203845.36388: variable 'ansible_python' from source: facts
13271 1727203845.36391: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
13271 1727203845.36462: variable '__network_wpa_supplicant_required' from source: role '' defaults
13271 1727203845.36549: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
13271 1727203845.36688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203845.36738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203845.36741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203845.36814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203845.36817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203845.36939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203845.36951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203845.36954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203845.36956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203845.36958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203845.37136: variable 'network_connections' from source: task vars
13271 1727203845.37140: variable 'port2_profile' from source: play vars
13271 1727203845.37236: variable 'port2_profile' from source: play vars
13271 1727203845.37241: variable 'port1_profile' from source: play vars
13271 1727203845.37342: variable 'port1_profile' from source: play vars
13271 1727203845.37345: variable 'controller_profile' from source: play vars
13271 1727203845.37439: variable 'controller_profile' from source: play vars
13271 1727203845.37494: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13271 1727203845.37541: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13271 1727203845.37563: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203845.37588: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13271 1727203845.37629: variable '__network_wireless_connections_defined' from source: role '' defaults
13271 1727203845.37852: variable 'network_connections' from source: task vars
13271 1727203845.37874: variable 'port2_profile' from source: play vars
13271 1727203845.37940: variable 'port2_profile' from source: play vars
13271 1727203845.37948: variable 'port1_profile' from source: play vars
13271 1727203845.38038: variable 'port1_profile' from source: play vars
13271 1727203845.38041: variable 'controller_profile' from source: play vars
13271 1727203845.38128: variable 'controller_profile' from source: play vars
13271 1727203845.38144: variable '__network_packages_default_wireless' from source: role '' defaults
13271 1727203845.38209: variable '__network_wireless_connections_defined' from source: role '' defaults
13271 1727203845.38457: variable 'network_connections' from source: task vars
13271 1727203845.38460: variable 'port2_profile' from source: play vars
13271 1727203845.38521: variable 'port2_profile' from source: play vars
13271 1727203845.38524: variable 'port1_profile' from source: play vars
13271 1727203845.38572: variable 'port1_profile' from source: play vars
13271 1727203845.38580: variable 'controller_profile' from source: play vars
13271 1727203845.38636: variable 'controller_profile' from source: play vars
13271 1727203845.38655: variable '__network_packages_default_team' from source: role '' defaults
13271 1727203845.38716: variable '__network_team_connections_defined' from source: role '' defaults
13271 1727203845.38938: variable 'network_connections' from source: task vars
13271 1727203845.38943: variable 'port2_profile' from source: play vars
13271 1727203845.38994: variable 'port2_profile' from source: play vars
13271 1727203845.39007: variable 'port1_profile' from source: play vars
13271 1727203845.39056: variable 'port1_profile' from source: play vars
13271 1727203845.39067: variable 'controller_profile' from source: play vars
13271 1727203845.39123: variable 'controller_profile' from source: play vars
13271 1727203845.39167: variable '__network_service_name_default_initscripts' from source: role '' defaults
13271 1727203845.39225: variable '__network_service_name_default_initscripts' from source: role '' defaults
13271 1727203845.39229: variable '__network_packages_default_initscripts' from source: role '' defaults
13271 1727203845.39274: variable '__network_packages_default_initscripts' from source: role '' defaults
13271 1727203845.39428: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
13271 1727203845.39897: variable 'network_connections' from source: task vars
13271 1727203845.39945: variable 'port2_profile' from source: play vars
13271 1727203845.39979: variable 'port2_profile' from source: play vars
13271 1727203845.39986: variable 'port1_profile' from source: play vars
13271 1727203845.40177: variable 'port1_profile' from source: play vars
13271 1727203845.40181: variable 'controller_profile' from source: play vars
13271 1727203845.40183: variable 'controller_profile' from source: play vars
13271 1727203845.40185: variable 'ansible_distribution' from source: facts
13271 1727203845.40187: variable '__network_rh_distros' from source: role '' defaults
13271 1727203845.40227: variable 'ansible_distribution_major_version' from source: facts
13271 1727203845.40270: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
13271 1727203845.40536: variable 'ansible_distribution' from source: facts
13271 1727203845.40539: variable '__network_rh_distros' from source: role '' defaults
13271 1727203845.40546: variable 'ansible_distribution_major_version' from source: facts
13271 1727203845.40565: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
13271 1727203845.40886: variable 'ansible_distribution' from source: facts
13271 1727203845.40890: variable '__network_rh_distros' from source: role '' defaults
13271 1727203845.40893: variable 'ansible_distribution_major_version' from source: facts
13271 1727203845.40930: variable 'network_provider' from source: set_fact
13271 1727203845.40941: variable 'ansible_facts' from source: unknown
13271 1727203845.41372: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
13271 1727203845.41380: when evaluation is False, skipping this task
13271 1727203845.41383: _execute() done
13271 1727203845.41385: dumping result to json
13271 1727203845.41389: done dumping result, returning
13271 1727203845.41400: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [028d2410-947f-2a40-12ba-000000000084]
13271 1727203845.41402: sending task result for task 028d2410-947f-2a40-12ba-000000000084
13271 1727203845.41523: done sending task result for task 028d2410-947f-2a40-12ba-000000000084
13271 1727203845.41525: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
13271 1727203845.41592: no more pending results, returning what we have
13271 1727203845.41595: results queue empty
13271 1727203845.41596: checking for any_errors_fatal
13271 1727203845.41601: done checking for any_errors_fatal
13271 1727203845.41602: checking for max_fail_percentage
13271 1727203845.41604: done checking for max_fail_percentage
13271 1727203845.41605: checking to see if all hosts have failed and the running result is not ok
13271 1727203845.41606: done checking to see if all hosts have failed
13271 1727203845.41606: getting the remaining hosts for this loop
13271 1727203845.41608: done getting the remaining hosts for this loop
13271 1727203845.41615: getting the next task for host managed-node1
13271 1727203845.41622: done getting next task for host managed-node1
13271 1727203845.41626: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
13271 1727203845.41629: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13271 1727203845.41649: getting variables
13271 1727203845.41651: in VariableManager get_vars()
13271 1727203845.41691: Calling all_inventory to load vars for managed-node1
13271 1727203845.41694: Calling groups_inventory to load vars for managed-node1
13271 1727203845.41696: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203845.41705: Calling all_plugins_play to load vars for managed-node1
13271 1727203845.41708: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203845.41710: Calling groups_plugins_play to load vars for managed-node1
13271 1727203845.42668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203845.43623: done with get_vars()
13271 1727203845.43651: done getting variables
13271 1727203845.43712: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024 14:50:45 -0400 (0:00:00.129) 0:00:29.080 *****
13271 1727203845.43750: entering _queue_task() for managed-node1/package
13271 1727203845.44283: worker is 1 (out of 1 available)
13271 1727203845.44296: exiting _queue_task() for managed-node1/package
13271 1727203845.44307: done queuing things up, now waiting for results queue to drain
13271 1727203845.44309: waiting for pending results...
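The "Install packages" task above was skipped because its conditional, `not network_packages is subset(ansible_facts.packages.keys())`, evaluated to False: every requested package already appeared in the gathered package facts. As an illustrative sketch only (not the role's actual code; the variable values here are hypothetical), Ansible's Jinja2 `subset` test amounts to plain set containment:

```python
# Hypothetical values standing in for the role default and the package facts.
network_packages = ["NetworkManager"]
gathered_packages = {"NetworkManager": "1.46.0", "openssh": "9.6"}


def is_subset(needed, have):
    # Mirrors the behavior of the Jinja2 `subset` test: True when every
    # element of `needed` is contained in `have`.
    return set(needed).issubset(set(have))


# The task's `when` clause: not network_packages is subset(ansible_facts.packages.keys())
install_needed = not is_subset(network_packages, gathered_packages.keys())
# install_needed is False here, so the task is skipped, matching the log.
```

The same pattern explains the later skips: each task's `when` expression is evaluated against current variables, and a False result short-circuits execution before any module runs on the managed node.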
13271 1727203845.44866: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
13271 1727203845.44871: in run() - task 028d2410-947f-2a40-12ba-000000000085
13271 1727203845.44893: variable 'ansible_search_path' from source: unknown
13271 1727203845.44897: variable 'ansible_search_path' from source: unknown
13271 1727203845.44939: calling self._execute()
13271 1727203845.45106: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203845.45139: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203845.45142: variable 'omit' from source: magic vars
13271 1727203845.45728: variable 'ansible_distribution_major_version' from source: facts
13271 1727203845.45799: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203845.45922: variable 'network_state' from source: role '' defaults
13271 1727203845.45939: Evaluated conditional (network_state != {}): False
13271 1727203845.45996: when evaluation is False, skipping this task
13271 1727203845.46004: _execute() done
13271 1727203845.46010: dumping result to json
13271 1727203845.46013: done dumping result, returning
13271 1727203845.46015: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [028d2410-947f-2a40-12ba-000000000085]
13271 1727203845.46018: sending task result for task 028d2410-947f-2a40-12ba-000000000085
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13271 1727203845.46503: no more pending results, returning what we have
13271 1727203845.46506: results queue empty
13271 1727203845.46507: checking for any_errors_fatal
13271 1727203845.46513: done checking for any_errors_fatal
13271 1727203845.46514: checking for max_fail_percentage
13271 1727203845.46516: done checking for max_fail_percentage
13271 1727203845.46517: checking to see if all hosts have failed and the running result is not ok
13271 1727203845.46518: done checking to see if all hosts have failed
13271 1727203845.46519: getting the remaining hosts for this loop
13271 1727203845.46521: done getting the remaining hosts for this loop
13271 1727203845.46524: getting the next task for host managed-node1
13271 1727203845.46532: done getting next task for host managed-node1
13271 1727203845.46536: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
13271 1727203845.46541: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13271 1727203845.46570: getting variables
13271 1727203845.46574: in VariableManager get_vars()
13271 1727203845.46618: Calling all_inventory to load vars for managed-node1
13271 1727203845.46621: Calling groups_inventory to load vars for managed-node1
13271 1727203845.46623: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203845.46640: Calling all_plugins_play to load vars for managed-node1
13271 1727203845.46644: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203845.46647: Calling groups_plugins_play to load vars for managed-node1
13271 1727203845.47700: done sending task result for task 028d2410-947f-2a40-12ba-000000000085
13271 1727203845.47705: WORKER PROCESS EXITING
13271 1727203845.47857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203845.48864: done with get_vars()
13271 1727203845.48884: done getting variables
13271 1727203845.48929: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024 14:50:45 -0400 (0:00:00.052) 0:00:29.132 *****
13271 1727203845.48953: entering _queue_task() for managed-node1/package
13271 1727203845.49201: worker is 1 (out of 1 available)
13271 1727203845.49213: exiting _queue_task() for managed-node1/package
13271 1727203845.49224: done queuing things up, now waiting for results queue to drain
13271 1727203845.49225: waiting for pending results...
13271 1727203845.49445: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
13271 1727203845.49565: in run() - task 028d2410-947f-2a40-12ba-000000000086
13271 1727203845.49581: variable 'ansible_search_path' from source: unknown
13271 1727203845.49584: variable 'ansible_search_path' from source: unknown
13271 1727203845.49617: calling self._execute()
13271 1727203845.49691: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203845.49704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203845.49712: variable 'omit' from source: magic vars
13271 1727203845.50081: variable 'ansible_distribution_major_version' from source: facts
13271 1727203845.50084: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203845.50151: variable 'network_state' from source: role '' defaults
13271 1727203845.50171: Evaluated conditional (network_state != {}): False
13271 1727203845.50182: when evaluation is False, skipping this task
13271 1727203845.50190: _execute() done
13271 1727203845.50197: dumping result to json
13271 1727203845.50205: done dumping result, returning
13271 1727203845.50215: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [028d2410-947f-2a40-12ba-000000000086]
13271 1727203845.50225: sending task result for task 028d2410-947f-2a40-12ba-000000000086
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
13271 1727203845.50392: no more pending results, returning what we have
13271 1727203845.50396: results queue empty
13271 1727203845.50397: checking for any_errors_fatal
13271 1727203845.50404: done checking for any_errors_fatal
13271 1727203845.50405: checking for max_fail_percentage
13271 1727203845.50407: done checking for max_fail_percentage
13271 1727203845.50408: checking to see if all hosts have failed and the running result is not ok
13271 1727203845.50409: done checking to see if all hosts have failed
13271 1727203845.50410: getting the remaining hosts for this loop
13271 1727203845.50411: done getting the remaining hosts for this loop
13271 1727203845.50415: getting the next task for host managed-node1
13271 1727203845.50422: done getting next task for host managed-node1
13271 1727203845.50426: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
13271 1727203845.50430: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13271 1727203845.50450: getting variables
13271 1727203845.50452: in VariableManager get_vars()
13271 1727203845.50498: Calling all_inventory to load vars for managed-node1
13271 1727203845.50501: Calling groups_inventory to load vars for managed-node1
13271 1727203845.50504: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203845.50517: Calling all_plugins_play to load vars for managed-node1
13271 1727203845.50521: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203845.50524: Calling groups_plugins_play to load vars for managed-node1
13271 1727203845.51350: done sending task result for task 028d2410-947f-2a40-12ba-000000000086
13271 1727203845.51354: WORKER PROCESS EXITING
13271 1727203845.51841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203845.53099: done with get_vars()
13271 1727203845.53128: done getting variables
13271 1727203845.53190: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024 14:50:45 -0400 (0:00:00.042) 0:00:29.175 *****
13271 1727203845.53221: entering _queue_task() for managed-node1/service
13271 1727203845.53469: worker is 1 (out of 1 available)
13271 1727203845.53484: exiting _queue_task() for managed-node1/service
13271 1727203845.53494: done queuing things up, now waiting for results queue to drain
13271 1727203845.53496: waiting for pending results...
13271 1727203845.53677: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
13271 1727203845.53837: in run() - task 028d2410-947f-2a40-12ba-000000000087
13271 1727203845.53842: variable 'ansible_search_path' from source: unknown
13271 1727203845.53844: variable 'ansible_search_path' from source: unknown
13271 1727203845.53847: calling self._execute()
13271 1727203845.53930: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203845.53934: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203845.54009: variable 'omit' from source: magic vars
13271 1727203845.55283: variable 'ansible_distribution_major_version' from source: facts
13271 1727203845.55287: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203845.55662: variable '__network_wireless_connections_defined' from source: role '' defaults
13271 1727203845.56349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13271 1727203845.59064: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13271 1727203845.59682: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13271 1727203845.59686: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13271 1727203845.59689: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13271 1727203845.59691: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13271 1727203845.60073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203845.60194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203845.60219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203845.60282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203845.60287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203845.60561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203845.60565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203845.60569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203845.60572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203845.60574: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203845.60579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203845.60581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203845.60584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203845.60701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203845.60704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203845.61084: variable 'network_connections' from source: task vars
13271 1727203845.61192: variable 'port2_profile' from source: play vars
13271 1727203845.61295: variable 'port2_profile' from source: play vars
13271 1727203845.61380: variable 'port1_profile' from source: play vars
13271 1727203845.61383: variable 'port1_profile' from source: play vars
13271 1727203845.61386: variable 'controller_profile' from source: play vars
13271 1727203845.61430: variable 'controller_profile' from source: play vars
13271 1727203845.61729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13271 1727203845.62393: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13271 1727203845.62397: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13271 1727203845.62399: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13271 1727203845.62425: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13271 1727203845.62474: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13271 1727203845.62781: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13271 1727203845.62785: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203845.62787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13271 1727203845.62789: variable '__network_team_connections_defined' from source: role '' defaults
13271 1727203845.63490: variable 'network_connections' from source: task vars
13271 1727203845.63495: variable 'port2_profile' from source: play vars
13271 1727203845.63554: variable 'port2_profile' from source: play vars
13271 1727203845.63564: variable 'port1_profile' from source: play vars
13271 1727203845.63900: variable 'port1_profile' from source: play vars
13271 1727203845.63907: variable 'controller_profile' from source: play vars
13271 1727203845.63966: variable 'controller_profile' from source: play vars
13271 1727203845.64002: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
13271 1727203845.64015: when evaluation is False, skipping this task
13271 1727203845.64020: _execute() done
13271 1727203845.64023: dumping result to json
13271 1727203845.64027: done dumping result, returning
13271 1727203845.64029: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [028d2410-947f-2a40-12ba-000000000087]
13271 1727203845.64031: sending task result for task 028d2410-947f-2a40-12ba-000000000087
13271 1727203845.64215: done sending task result for task 028d2410-947f-2a40-12ba-000000000087
13271 1727203845.64217: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
13271 1727203845.64272: no more pending results, returning what we have
13271 1727203845.64277: results queue empty
13271 1727203845.64278: checking for any_errors_fatal
13271 1727203845.64284: done checking for any_errors_fatal
13271 1727203845.64284: checking for max_fail_percentage
13271 1727203845.64286: done checking for max_fail_percentage
13271 1727203845.64287: checking to see if all hosts have failed and the running result is not ok
13271 1727203845.64288: done checking to see if all hosts have failed
13271 1727203845.64289: getting the remaining hosts for this loop
13271 1727203845.64290: done getting the remaining hosts for this loop
13271 1727203845.64293: getting the next task for host managed-node1
13271 1727203845.64301: done getting next task for host managed-node1
13271 1727203845.64304: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
13271 1727203845.64309: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1,
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13271 1727203845.64328: getting variables 13271 1727203845.64330: in VariableManager get_vars() 13271 1727203845.64379: Calling all_inventory to load vars for managed-node1 13271 1727203845.64382: Calling groups_inventory to load vars for managed-node1 13271 1727203845.64384: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203845.64395: Calling all_plugins_play to load vars for managed-node1 13271 1727203845.64398: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203845.64400: Calling groups_plugins_play to load vars for managed-node1 13271 1727203845.67863: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203845.70655: done with get_vars() 13271 1727203845.70701: done getting variables 13271 1727203845.70824: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: 
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:50:45 -0400 (0:00:00.176) 0:00:29.352 ***** 13271 1727203845.70918: entering _queue_task() for managed-node1/service 13271 1727203845.71804: worker is 1 (out of 1 available) 13271 1727203845.71815: exiting _queue_task() for managed-node1/service 13271 1727203845.71827: done queuing things up, now waiting for results queue to drain 13271 1727203845.71828: waiting for pending results... 13271 1727203845.72147: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 13271 1727203845.72271: in run() - task 028d2410-947f-2a40-12ba-000000000088 13271 1727203845.72294: variable 'ansible_search_path' from source: unknown 13271 1727203845.72301: variable 'ansible_search_path' from source: unknown 13271 1727203845.72355: calling self._execute() 13271 1727203845.72466: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203845.72537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203845.72543: variable 'omit' from source: magic vars 13271 1727203845.72989: variable 'ansible_distribution_major_version' from source: facts 13271 1727203845.73005: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203845.73193: variable 'network_provider' from source: set_fact 13271 1727203845.73301: variable 'network_state' from source: role '' defaults 13271 1727203845.73304: Evaluated conditional (network_provider == "nm" or network_state != {}): True 13271 1727203845.73306: variable 'omit' from source: magic vars 13271 1727203845.73308: variable 'omit' from source: magic vars 13271 1727203845.73330: variable 'network_service_name' from source: role '' defaults 13271 1727203845.73412: variable 'network_service_name' from source: role '' defaults 13271 1727203845.73532: variable '__network_provider_setup' from 
source: role '' defaults 13271 1727203845.73542: variable '__network_service_name_default_nm' from source: role '' defaults 13271 1727203845.73608: variable '__network_service_name_default_nm' from source: role '' defaults 13271 1727203845.73629: variable '__network_packages_default_nm' from source: role '' defaults 13271 1727203845.73706: variable '__network_packages_default_nm' from source: role '' defaults 13271 1727203845.73982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13271 1727203845.78827: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13271 1727203845.79010: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13271 1727203845.79264: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13271 1727203845.79268: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13271 1727203845.79270: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13271 1727203845.79446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203845.79517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203845.79616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203845.79663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203845.79717: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203845.79834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203845.80028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203845.80031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203845.80034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203845.80142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203845.80636: variable '__network_packages_default_gobject_packages' from source: role '' defaults 13271 1727203845.80928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203845.80957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 13271 1727203845.81038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203845.81124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203845.81157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203845.81359: variable 'ansible_python' from source: facts 13271 1727203845.81458: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 13271 1727203845.81882: variable '__network_wpa_supplicant_required' from source: role '' defaults 13271 1727203845.81885: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 13271 1727203845.81920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203845.82016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203845.82045: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203845.82197: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 
1727203845.82225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203845.82332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203845.82372: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203845.82468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203845.82515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203845.82600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203845.83083: variable 'network_connections' from source: task vars 13271 1727203845.83087: variable 'port2_profile' from source: play vars 13271 1727203845.83089: variable 'port2_profile' from source: play vars 13271 1727203845.83091: variable 'port1_profile' from source: play vars 13271 1727203845.83231: variable 'port1_profile' from source: play vars 13271 1727203845.83248: variable 'controller_profile' from source: play vars 13271 1727203845.83407: variable 'controller_profile' from source: play vars 13271 1727203845.83469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 13271 
1727203845.83712: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 13271 1727203845.83783: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 13271 1727203845.83830: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 13271 1727203845.83894: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 13271 1727203845.83975: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 13271 1727203845.84012: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 13271 1727203845.84049: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203845.84105: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 13271 1727203845.84165: variable '__network_wireless_connections_defined' from source: role '' defaults 13271 1727203845.84483: variable 'network_connections' from source: task vars 13271 1727203845.84511: variable 'port2_profile' from source: play vars 13271 1727203845.84583: variable 'port2_profile' from source: play vars 13271 1727203845.84618: variable 'port1_profile' from source: play vars 13271 1727203845.84685: variable 'port1_profile' from source: play vars 13271 1727203845.84708: variable 'controller_profile' from source: play vars 13271 1727203845.84817: variable 'controller_profile' from source: play vars 13271 
1727203845.84822: variable '__network_packages_default_wireless' from source: role '' defaults 13271 1727203845.84911: variable '__network_wireless_connections_defined' from source: role '' defaults 13271 1727203845.85241: variable 'network_connections' from source: task vars 13271 1727203845.85257: variable 'port2_profile' from source: play vars 13271 1727203845.85338: variable 'port2_profile' from source: play vars 13271 1727203845.85363: variable 'port1_profile' from source: play vars 13271 1727203845.85470: variable 'port1_profile' from source: play vars 13271 1727203845.85473: variable 'controller_profile' from source: play vars 13271 1727203845.85535: variable 'controller_profile' from source: play vars 13271 1727203845.85564: variable '__network_packages_default_team' from source: role '' defaults 13271 1727203845.85688: variable '__network_team_connections_defined' from source: role '' defaults 13271 1727203845.85988: variable 'network_connections' from source: task vars 13271 1727203845.85998: variable 'port2_profile' from source: play vars 13271 1727203845.86087: variable 'port2_profile' from source: play vars 13271 1727203845.86099: variable 'port1_profile' from source: play vars 13271 1727203845.86185: variable 'port1_profile' from source: play vars 13271 1727203845.86230: variable 'controller_profile' from source: play vars 13271 1727203845.86282: variable 'controller_profile' from source: play vars 13271 1727203845.86357: variable '__network_service_name_default_initscripts' from source: role '' defaults 13271 1727203845.86424: variable '__network_service_name_default_initscripts' from source: role '' defaults 13271 1727203845.86445: variable '__network_packages_default_initscripts' from source: role '' defaults 13271 1727203845.86555: variable '__network_packages_default_initscripts' from source: role '' defaults 13271 1727203845.87004: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 13271 1727203845.87605: 
variable 'network_connections' from source: task vars 13271 1727203845.87614: variable 'port2_profile' from source: play vars 13271 1727203845.87691: variable 'port2_profile' from source: play vars 13271 1727203845.87703: variable 'port1_profile' from source: play vars 13271 1727203845.87767: variable 'port1_profile' from source: play vars 13271 1727203845.87868: variable 'controller_profile' from source: play vars 13271 1727203845.87871: variable 'controller_profile' from source: play vars 13271 1727203845.87873: variable 'ansible_distribution' from source: facts 13271 1727203845.87876: variable '__network_rh_distros' from source: role '' defaults 13271 1727203845.87879: variable 'ansible_distribution_major_version' from source: facts 13271 1727203845.87903: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 13271 1727203845.88188: variable 'ansible_distribution' from source: facts 13271 1727203845.88192: variable '__network_rh_distros' from source: role '' defaults 13271 1727203845.88194: variable 'ansible_distribution_major_version' from source: facts 13271 1727203845.88197: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 13271 1727203845.88367: variable 'ansible_distribution' from source: facts 13271 1727203845.88380: variable '__network_rh_distros' from source: role '' defaults 13271 1727203845.88399: variable 'ansible_distribution_major_version' from source: facts 13271 1727203845.88456: variable 'network_provider' from source: set_fact 13271 1727203845.88535: variable 'omit' from source: magic vars 13271 1727203845.88538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203845.88571: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203845.88600: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 
1727203845.88623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203845.88652: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203845.88755: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203845.88758: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203845.88763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203845.88820: Set connection var ansible_connection to ssh 13271 1727203845.88834: Set connection var ansible_shell_type to sh 13271 1727203845.88848: Set connection var ansible_timeout to 10 13271 1727203845.88872: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203845.88886: Set connection var ansible_pipelining to False 13271 1727203845.88896: Set connection var ansible_shell_executable to /bin/sh 13271 1727203845.88927: variable 'ansible_shell_executable' from source: unknown 13271 1727203845.88935: variable 'ansible_connection' from source: unknown 13271 1727203845.88973: variable 'ansible_module_compression' from source: unknown 13271 1727203845.88978: variable 'ansible_shell_type' from source: unknown 13271 1727203845.88980: variable 'ansible_shell_executable' from source: unknown 13271 1727203845.88983: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203845.88985: variable 'ansible_pipelining' from source: unknown 13271 1727203845.88986: variable 'ansible_timeout' from source: unknown 13271 1727203845.88988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203845.89102: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203845.89191: variable 'omit' from source: magic vars 13271 1727203845.89194: starting attempt loop 13271 1727203845.89196: running the handler 13271 1727203845.89220: variable 'ansible_facts' from source: unknown 13271 1727203845.90072: _low_level_execute_command(): starting 13271 1727203845.90088: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203845.91237: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 13271 1727203845.91493: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203845.91496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203845.91499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203845.91563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203845.91701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 13271 1727203845.93500: stdout chunk (state=3): >>>/root <<< 13271 1727203845.93588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203845.93622: stderr chunk (state=3): >>><<< 13271 1727203845.93627: stdout chunk (state=3): >>><<< 13271 1727203845.93646: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203845.93656: _low_level_execute_command(): starting 13271 1727203845.93665: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203845.9364634-15193-2790887167362 `" && echo ansible-tmp-1727203845.9364634-15193-2790887167362="` echo /root/.ansible/tmp/ansible-tmp-1727203845.9364634-15193-2790887167362 `" ) && sleep 0' 13271 1727203845.94107: 
stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203845.94110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203845.94113: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203845.94115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found <<< 13271 1727203845.94117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203845.94164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203845.94167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203845.94256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203845.96585: stdout chunk (state=3): >>>ansible-tmp-1727203845.9364634-15193-2790887167362=/root/.ansible/tmp/ansible-tmp-1727203845.9364634-15193-2790887167362 <<< 13271 1727203845.96588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203845.96591: stdout chunk (state=3): >>><<< 13271 1727203845.96593: stderr chunk (state=3): >>><<< 13271 1727203845.96595: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727203845.9364634-15193-2790887167362=/root/.ansible/tmp/ansible-tmp-1727203845.9364634-15193-2790887167362 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203845.96602: variable 'ansible_module_compression' from source: unknown 13271 1727203845.96643: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 13271 1727203845.96855: variable 'ansible_facts' from source: unknown 13271 1727203845.97110: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203845.9364634-15193-2790887167362/AnsiballZ_systemd.py 13271 1727203845.97223: Sending initial data 13271 1727203845.97228: Sent initial data (154 bytes) 13271 1727203845.97890: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203845.97933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203845.97951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203845.97964: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203845.98096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203845.99890: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: 
Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203845.99971: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13271 1727203846.00049: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpth54o96r /root/.ansible/tmp/ansible-tmp-1727203845.9364634-15193-2790887167362/AnsiballZ_systemd.py <<< 13271 1727203846.00053: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203845.9364634-15193-2790887167362/AnsiballZ_systemd.py" <<< 13271 1727203846.00178: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpth54o96r" to remote "/root/.ansible/tmp/ansible-tmp-1727203845.9364634-15193-2790887167362/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203845.9364634-15193-2790887167362/AnsiballZ_systemd.py" <<< 13271 1727203846.01550: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203846.01597: stderr chunk (state=3): >>><<< 13271 1727203846.01607: stdout chunk (state=3): >>><<< 13271 1727203846.01609: done transferring module to remote 13271 1727203846.01618: _low_level_execute_command(): starting 13271 1727203846.01623: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203845.9364634-15193-2790887167362/ /root/.ansible/tmp/ansible-tmp-1727203845.9364634-15193-2790887167362/AnsiballZ_systemd.py && sleep 0' 13271 1727203846.02191: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203846.02224: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203846.02241: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203846.02265: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203846.02407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203846.04372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203846.04399: stderr chunk (state=3): >>><<< 13271 1727203846.04402: stdout chunk (state=3): >>><<< 13271 1727203846.04413: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203846.04416: _low_level_execute_command(): starting 13271 1727203846.04421: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203845.9364634-15193-2790887167362/AnsiballZ_systemd.py && sleep 0' 13271 1727203846.05032: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203846.05036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203846.05038: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration <<< 13271 1727203846.05044: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203846.05047: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203846.05084: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203846.05209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203846.05266: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203846.36661: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10489856", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3282644992", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "435250000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 13271 1727203846.36705: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", 
"LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": 
"fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 13271 1727203846.39146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203846.39151: stdout chunk (state=3): >>><<< 13271 1727203846.39155: stderr chunk (state=3): >>><<< 13271 1727203846.39381: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "705", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainStartTimestampMonotonic": "33322039", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ExecMainHandoffTimestampMonotonic": "33336258", "ExecMainPID": "705", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10489856", "MemoryPeak": "13869056", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3282644992", "EffectiveMemoryMax": "3702874112", "EffectiveMemoryHigh": "3702874112", "CPUUsageNSec": "435250000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "sysinit.target system.slice dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service cloud-init.service network.target shutdown.target multi-user.target", "After": "network-pre.target sysinit.target system.slice basic.target dbus.socket systemd-journald.socket cloud-init-local.service dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:50:22 EDT", "StateChangeTimestampMonotonic": "413618667", "InactiveExitTimestamp": "Tue 2024-09-24 14:44:02 EDT", "InactiveExitTimestampMonotonic": "33322542", "ActiveEnterTimestamp": "Tue 2024-09-24 14:44:03 EDT", "ActiveEnterTimestampMonotonic": "34680535", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:44:02 EDT", "ConditionTimestampMonotonic": "33321151", "AssertTimestamp": "Tue 2024-09-24 14:44:02 EDT", "AssertTimestampMonotonic": "33321155", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "53c91cc8356748b484feba73dc5ee144", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
13271 1727203846.39393: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203845.9364634-15193-2790887167362/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
13271 1727203846.39395: _low_level_execute_command(): starting
13271 1727203846.39398: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203845.9364634-15193-2790887167362/ > /dev/null 2>&1 && sleep 0'
13271 1727203846.39985: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
13271 1727203846.39999: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
13271 1727203846.40015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
13271 1727203846.40035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
13271 1727203846.40054: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<<
13271 1727203846.40090: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<<
13271 1727203846.40104: stderr chunk (state=3): >>>debug2: match found <<<
13271 1727203846.40190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<<
13271 1727203846.40321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
13271 1727203846.40727: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
13271 1727203846.42519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
13271 1727203846.42529: stdout chunk (state=3): >>><<<
13271 1727203846.42541: stderr chunk (state=3): >>><<<
13271 1727203846.42560: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
13271 1727203846.42578: handler run complete
13271 1727203846.42644: attempt loop complete, returning result
13271 1727203846.42652: _execute() done
13271 1727203846.42665: dumping result to json
13271 1727203846.42692: done dumping result, returning
13271 1727203846.42707: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [028d2410-947f-2a40-12ba-000000000088]
13271 1727203846.42716: sending task result for task 028d2410-947f-2a40-12ba-000000000088
ok: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
13271 1727203846.43069: no more pending results, returning what we have
13271 1727203846.43072: results queue empty
13271 1727203846.43073: checking for any_errors_fatal
13271 1727203846.43080: done checking for any_errors_fatal
13271 1727203846.43081: checking for max_fail_percentage
13271 1727203846.43083: done checking for max_fail_percentage
13271 1727203846.43083: checking to see if all hosts have failed and the running result is not ok
13271 1727203846.43084: done checking to see if all hosts have failed
13271 1727203846.43085: getting the remaining hosts for this loop
13271 1727203846.43086: done getting the remaining hosts for this loop
13271 1727203846.43089: getting the next task for host managed-node1
13271 1727203846.43096: done getting next task for host managed-node1
13271 1727203846.43099: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
13271 1727203846.43103: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13271 1727203846.43123: getting variables
13271 1727203846.43125: in VariableManager get_vars()
13271 1727203846.43160: Calling all_inventory to load vars for managed-node1
13271 1727203846.43162: Calling groups_inventory to load vars for managed-node1
13271 1727203846.43165: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203846.43174: Calling all_plugins_play to load vars for managed-node1
13271 1727203846.43335: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203846.43341: Calling groups_plugins_play to load vars for managed-node1
13271 1727203846.43902: done sending task result for task 028d2410-947f-2a40-12ba-000000000088
13271 1727203846.43906: WORKER PROCESS EXITING
13271 1727203846.44913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203846.46939: done with get_vars()
13271 1727203846.46969: done getting variables
13271 1727203846.47031: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Tuesday 24 September 2024 14:50:46 -0400 (0:00:00.761) 0:00:30.113 *****
13271 1727203846.47078: entering _queue_task() for managed-node1/service
13271 1727203846.47403: worker is 1 (out of 1 available)
13271 1727203846.47414: exiting _queue_task() for managed-node1/service
13271 1727203846.47427: done queuing things up, now waiting for results queue to drain
13271 1727203846.47428: waiting for pending results...
13271 1727203846.47712: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
13271 1727203846.47857: in run() - task 028d2410-947f-2a40-12ba-000000000089
13271 1727203846.47885: variable 'ansible_search_path' from source: unknown
13271 1727203846.47895: variable 'ansible_search_path' from source: unknown
13271 1727203846.47938: calling self._execute()
13271 1727203846.48039: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203846.48050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203846.48069: variable 'omit' from source: magic vars
13271 1727203846.48440: variable 'ansible_distribution_major_version' from source: facts
13271 1727203846.48680: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203846.48683: variable 'network_provider' from source: set_fact
13271 1727203846.48686: Evaluated conditional (network_provider == "nm"): True
13271 1727203846.48687: variable '__network_wpa_supplicant_required' from source: role '' defaults
13271 1727203846.48754: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
13271 1727203846.48938: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
13271 1727203846.51357: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
13271 1727203846.51434: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
13271 1727203846.51482: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
13271 1727203846.51528: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
13271 1727203846.51558: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
13271 1727203846.51653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203846.51694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203846.51726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203846.51779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203846.51800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203846.51858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203846.51893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203846.51924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203846.51977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203846.51998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203846.52042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
13271 1727203846.52081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
13271 1727203846.52111: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203846.52155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
13271 1727203846.52186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
13271 1727203846.52333: variable 'network_connections' from source: task vars
13271 1727203846.52351: variable 'port2_profile' from source: play vars
13271 1727203846.52480: variable 'port2_profile' from source: play vars
13271 1727203846.52486: variable 'port1_profile' from source: play vars
13271 1727203846.52510: variable 'port1_profile' from source: play vars
13271 1727203846.52523: variable 'controller_profile' from source: play vars
13271 1727203846.52586: variable 'controller_profile' from source: play vars
13271 1727203846.52668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
13271 1727203846.52868: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
13271 1727203846.52911: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
13271 1727203846.52952: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
13271 1727203846.52990: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
13271 1727203846.53080: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
13271 1727203846.53083: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
13271 1727203846.53086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
13271 1727203846.53117: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
13271 1727203846.53177: variable '__network_wireless_connections_defined' from source: role '' defaults
13271 1727203846.53415: variable 'network_connections' from source: task vars
13271 1727203846.53425: variable 'port2_profile' from source: play vars
13271 1727203846.53495: variable 'port2_profile' from source: play vars
13271 1727203846.53507: variable 'port1_profile' from source: play vars
13271 1727203846.53571: variable 'port1_profile' from source: play vars
13271 1727203846.53688: variable 'controller_profile' from source: play vars
13271 1727203846.53691: variable 'controller_profile' from source: play vars
13271 1727203846.53694: Evaluated conditional (__network_wpa_supplicant_required): False
13271 1727203846.53696: when evaluation is False, skipping this task
13271 1727203846.53706: _execute() done
13271 1727203846.53714: dumping result to json
13271 1727203846.53722: done dumping result, returning
13271 1727203846.53734: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [028d2410-947f-2a40-12ba-000000000089]
13271 1727203846.53743: sending task result for task 028d2410-947f-2a40-12ba-000000000089
13271 1727203846.54021: done sending task result for task 028d2410-947f-2a40-12ba-000000000089
13271 1727203846.54024: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wpa_supplicant_required",
    "skip_reason": "Conditional result was False"
}
13271 1727203846.54081: no more pending results, returning what we have
13271 1727203846.54084: results queue empty
13271 1727203846.54085: checking for any_errors_fatal
13271 1727203846.54107: done checking for any_errors_fatal
13271 1727203846.54109: checking for max_fail_percentage
13271 1727203846.54111: done checking for max_fail_percentage
13271 1727203846.54112: checking to see if all hosts have failed and the running result is not ok
13271 1727203846.54113: done checking to see if all hosts have failed
13271 1727203846.54113: getting the remaining hosts for this loop
13271 1727203846.54115: done getting the remaining hosts for this loop
13271 1727203846.54119: getting the next task for host managed-node1
13271 1727203846.54127: done getting next task for host managed-node1
13271 1727203846.54130: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
13271 1727203846.54135: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13271 1727203846.54155: getting variables
13271 1727203846.54157: in VariableManager get_vars()
13271 1727203846.54207: Calling all_inventory to load vars for managed-node1
13271 1727203846.54210: Calling groups_inventory to load vars for managed-node1
13271 1727203846.54212: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203846.54224: Calling all_plugins_play to load vars for managed-node1
13271 1727203846.54227: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203846.54231: Calling groups_plugins_play to load vars for managed-node1
13271 1727203846.55939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203846.57486: done with get_vars()
13271 1727203846.57510: done getting variables
13271 1727203846.57577: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Tuesday 24 September 2024 14:50:46 -0400 (0:00:00.105) 0:00:30.219 *****
13271 1727203846.57612: entering _queue_task() for managed-node1/service
13271 1727203846.57938: worker is 1 (out of 1 available)
13271 1727203846.57950: exiting _queue_task() for managed-node1/service
13271 1727203846.57964: done queuing things up, now waiting for results queue to drain
13271 1727203846.57965: waiting for pending results...
13271 1727203846.58246: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service
13271 1727203846.58406: in run() - task 028d2410-947f-2a40-12ba-00000000008a
13271 1727203846.58427: variable 'ansible_search_path' from source: unknown
13271 1727203846.58436: variable 'ansible_search_path' from source: unknown
13271 1727203846.58508: calling self._execute()
13271 1727203846.58782: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203846.58786: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203846.58789: variable 'omit' from source: magic vars
13271 1727203846.59076: variable 'ansible_distribution_major_version' from source: facts
13271 1727203846.59097: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203846.59218: variable 'network_provider' from source: set_fact
13271 1727203846.59221: Evaluated conditional (network_provider == "initscripts"): False
13271 1727203846.59226: when evaluation is False, skipping this task
13271 1727203846.59229: _execute() done
13271 1727203846.59232: dumping result to json
13271 1727203846.59234: done dumping result, returning
13271 1727203846.59243: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [028d2410-947f-2a40-12ba-00000000008a]
13271 1727203846.59248: sending task result for task 028d2410-947f-2a40-12ba-00000000008a
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
13271 1727203846.59387: no more pending results, returning what we have
13271 1727203846.59390: results queue empty
13271 1727203846.59391: checking for any_errors_fatal
13271 1727203846.59400: done checking for any_errors_fatal
13271 1727203846.59401: checking for max_fail_percentage
13271 1727203846.59403: done checking for max_fail_percentage
13271 1727203846.59404: checking to see if all hosts have failed and the running result is not ok
13271 1727203846.59405: done checking to see if all hosts have failed
13271 1727203846.59406: getting the remaining hosts for this loop
13271 1727203846.59407: done getting the remaining hosts for this loop
13271 1727203846.59411: getting the next task for host managed-node1
13271 1727203846.59418: done getting next task for host managed-node1
13271 1727203846.59421: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
13271 1727203846.59425: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13271 1727203846.59446: getting variables
13271 1727203846.59447: in VariableManager get_vars()
13271 1727203846.59491: Calling all_inventory to load vars for managed-node1
13271 1727203846.59494: Calling groups_inventory to load vars for managed-node1
13271 1727203846.59496: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203846.59506: Calling all_plugins_play to load vars for managed-node1
13271 1727203846.59509: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203846.59512: Calling groups_plugins_play to load vars for managed-node1
13271 1727203846.60281: done sending task result for task 028d2410-947f-2a40-12ba-00000000008a
13271 1727203846.60285: WORKER PROCESS EXITING
13271 1727203846.61039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203846.63866: done with get_vars()
13271 1727203846.63895: done getting variables
13271 1727203846.63956: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Tuesday 24 September 2024 14:50:46 -0400 (0:00:00.063) 0:00:30.283 *****
13271 1727203846.64012: entering _queue_task() for managed-node1/copy
13271 1727203846.64353: worker is 1 (out of 1 available)
13271 1727203846.64364: exiting _queue_task() for managed-node1/copy
13271 1727203846.64380: done queuing things up, now waiting for results queue to drain
13271 1727203846.64381: waiting for pending results...
13271 1727203846.64690: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
13271 1727203846.64832: in run() - task 028d2410-947f-2a40-12ba-00000000008b
13271 1727203846.64846: variable 'ansible_search_path' from source: unknown
13271 1727203846.64849: variable 'ansible_search_path' from source: unknown
13271 1727203846.64896: calling self._execute()
13271 1727203846.64995: variable 'ansible_host' from source: host vars for 'managed-node1'
13271 1727203846.64999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
13271 1727203846.65010: variable 'omit' from source: magic vars
13271 1727203846.65385: variable 'ansible_distribution_major_version' from source: facts
13271 1727203846.65397: Evaluated conditional (ansible_distribution_major_version != '6'): True
13271 1727203846.65526: variable 'network_provider' from source: set_fact
13271 1727203846.65530: Evaluated conditional (network_provider == "initscripts"): False
13271 1727203846.65535: when evaluation is False, skipping this task
13271 1727203846.65538: _execute() done
13271 1727203846.65540: dumping result to json
13271 1727203846.65543: done dumping result, returning
13271 1727203846.65552: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [028d2410-947f-2a40-12ba-00000000008b]
13271 1727203846.65557: sending task result for task 028d2410-947f-2a40-12ba-00000000008b
13271 1727203846.65644: done sending task result for task 028d2410-947f-2a40-12ba-00000000008b
13271 1727203846.65650: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
13271 1727203846.65726: no more pending results, returning what we have
13271 1727203846.65737: results queue empty
13271 1727203846.65739: checking for any_errors_fatal
13271 1727203846.65744: done checking for any_errors_fatal
13271 1727203846.65745: checking for max_fail_percentage
13271 1727203846.65747: done checking for max_fail_percentage
13271 1727203846.65748: checking to see if all hosts have failed and the running result is not ok
13271 1727203846.65749: done checking to see if all hosts have failed
13271 1727203846.65749: getting the remaining hosts for this loop
13271 1727203846.65751: done getting the remaining hosts for this loop
13271 1727203846.65754: getting the next task for host managed-node1
13271 1727203846.65763: done getting next task for host managed-node1
13271 1727203846.65766: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
13271 1727203846.65771: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
13271 1727203846.65791: getting variables
13271 1727203846.65794: in VariableManager get_vars()
13271 1727203846.65835: Calling all_inventory to load vars for managed-node1
13271 1727203846.66205: Calling groups_inventory to load vars for managed-node1
13271 1727203846.66209: Calling all_plugins_inventory to load vars for managed-node1
13271 1727203846.66222: Calling all_plugins_play to load vars for managed-node1
13271 1727203846.66226: Calling groups_plugins_inventory to load vars for managed-node1
13271 1727203846.66229: Calling groups_plugins_play to load vars for managed-node1
13271 1727203846.69193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
13271 1727203846.71044: done with get_vars()
13271 1727203846.71080: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Tuesday 24 September 2024 14:50:46 -0400 (0:00:00.071) 0:00:30.354 *****
13271 1727203846.71181: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections
13271 1727203846.71603: worker is 1 (out of 1 available)
13271 1727203846.71766: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections
13271 1727203846.71778: done queuing things up, now waiting for results queue to drain
13271 1727203846.71780: waiting for pending results...
13271 1727203846.72094: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 13271 1727203846.72277: in run() - task 028d2410-947f-2a40-12ba-00000000008c 13271 1727203846.72295: variable 'ansible_search_path' from source: unknown 13271 1727203846.72298: variable 'ansible_search_path' from source: unknown 13271 1727203846.72333: calling self._execute() 13271 1727203846.72582: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203846.72586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203846.72588: variable 'omit' from source: magic vars 13271 1727203846.72903: variable 'ansible_distribution_major_version' from source: facts 13271 1727203846.72921: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203846.72928: variable 'omit' from source: magic vars 13271 1727203846.73003: variable 'omit' from source: magic vars 13271 1727203846.73381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 13271 1727203846.75624: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 13271 1727203846.75742: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 13271 1727203846.75933: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 13271 1727203846.75971: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 13271 1727203846.76078: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 13271 1727203846.76269: variable 'network_provider' from source: set_fact 13271 1727203846.76542: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 13271 1727203846.76693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 13271 1727203846.76718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 13271 1727203846.76760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 13271 1727203846.76783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 13271 1727203846.76855: variable 'omit' from source: magic vars 13271 1727203846.77182: variable 'omit' from source: magic vars 13271 1727203846.77449: variable 'network_connections' from source: task vars 13271 1727203846.77461: variable 'port2_profile' from source: play vars 13271 1727203846.77520: variable 'port2_profile' from source: play vars 13271 1727203846.77528: variable 'port1_profile' from source: play vars 13271 1727203846.77718: variable 'port1_profile' from source: play vars 13271 1727203846.77726: variable 'controller_profile' from source: play vars 13271 1727203846.77917: variable 'controller_profile' from source: play vars 13271 1727203846.78164: variable 'omit' from source: magic vars 13271 1727203846.78167: variable '__lsr_ansible_managed' from source: task vars 13271 1727203846.78176: variable '__lsr_ansible_managed' from source: task vars 13271 1727203846.78371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 13271 
1727203846.79080: Loaded config def from plugin (lookup/template) 13271 1727203846.79084: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 13271 1727203846.79086: File lookup term: get_ansible_managed.j2 13271 1727203846.79088: variable 'ansible_search_path' from source: unknown 13271 1727203846.79091: evaluation_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 13271 1727203846.79095: search_path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 13271 1727203846.79098: variable 'ansible_search_path' from source: unknown 13271 1727203846.87028: variable 'ansible_managed' from source: unknown 13271 1727203846.87191: variable 'omit' from source: magic vars 13271 1727203846.87226: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203846.87257: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203846.87286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203846.87381: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203846.87384: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203846.87387: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203846.87389: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203846.87391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203846.87468: Set connection var ansible_connection to ssh 13271 1727203846.87483: Set connection var ansible_shell_type to sh 13271 1727203846.87496: Set connection var ansible_timeout to 10 13271 1727203846.87507: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203846.87520: Set connection var ansible_pipelining to False 13271 1727203846.87530: Set connection var ansible_shell_executable to /bin/sh 13271 1727203846.87559: variable 'ansible_shell_executable' from source: unknown 13271 1727203846.87568: variable 'ansible_connection' from source: unknown 13271 1727203846.87577: variable 'ansible_module_compression' from source: unknown 13271 1727203846.87584: variable 'ansible_shell_type' from source: unknown 13271 1727203846.87590: variable 'ansible_shell_executable' from source: unknown 13271 1727203846.87597: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203846.87780: variable 'ansible_pipelining' from source: unknown 13271 1727203846.87783: variable 'ansible_timeout' from source: unknown 13271 1727203846.87794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 
1727203846.87796: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13271 1727203846.87799: variable 'omit' from source: magic vars 13271 1727203846.87801: starting attempt loop 13271 1727203846.87803: running the handler 13271 1727203846.87805: _low_level_execute_command(): starting 13271 1727203846.87807: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203846.88472: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203846.88586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203846.88614: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203846.88720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203846.90554: stdout chunk (state=3): >>>/root <<< 
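The first `_low_level_execute_command()` above runs `/bin/sh -c 'echo ~ && sleep 0'` on the target to discover the remote user's home directory (the `/root` in the stdout chunk). A minimal local sketch of that probe, run with `subprocess` instead of over SSH:

```python
import subprocess

# Re-run the home-directory probe from the log locally; Ansible sends the
# same shell snippet over the SSH connection and reads back one line.
result = subprocess.run(
    ["/bin/sh", "-c", "echo ~ && sleep 0"],
    capture_output=True,
    text=True,
)
home = result.stdout.strip()  # "/root" in the logged run; depends on $HOME here
```

The trailing `&& sleep 0` is part of the command Ansible actually sends, as shown in the log line above.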
13271 1727203846.90654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203846.90706: stderr chunk (state=3): >>><<< 13271 1727203846.90715: stdout chunk (state=3): >>><<< 13271 1727203846.90738: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203846.90764: _low_level_execute_command(): starting 13271 1727203846.90774: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203846.907523-15249-174267124054479 `" && echo ansible-tmp-1727203846.907523-15249-174267124054479="` echo /root/.ansible/tmp/ansible-tmp-1727203846.907523-15249-174267124054479 `" ) && sleep 0' 13271 1727203846.91428: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
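The second `_low_level_execute_command()` above creates the remote work directory, e.g. `ansible-tmp-1727203846.907523-15249-174267124054479`, and echoes the name back so the controller learns the path. A sketch that rebuilds the same shell command; the exact `<timestamp>-<pid>-<random>` naming is an assumption read off the logged path, not Ansible's internal API:

```python
import random
import time

def make_remote_tmp_command(base="/root/.ansible/tmp", pid=15249):
    """Build a shell command shaped like the one in the log: create the base
    dir with a restrictive umask, create a unique subdir, echo name=path."""
    # Hypothetical recreation of the name format seen above:
    #   ansible-tmp-1727203846.907523-15249-174267124054479
    name = "ansible-tmp-%s-%s-%s" % (time.time(), pid, random.randint(0, 2**48))
    path = "%s/%s" % (base, name)
    return (
        "/bin/sh -c '( umask 77 && mkdir -p \"%s\" && mkdir \"%s\" "
        "&& echo %s=\"%s\" ) && sleep 0'" % (base, path, name, path)
    )

cmd = make_remote_tmp_command()
```

The `umask 77` ensures the temp directory is readable only by the remote user, which matters because the module payload and its arguments are staged there before execution.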
13271 1727203846.91441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203846.91458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203846.91480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203846.91498: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203846.91524: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203846.91592: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203846.91637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203846.91655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203846.91682: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203846.91800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203846.93926: stdout chunk (state=3): >>>ansible-tmp-1727203846.907523-15249-174267124054479=/root/.ansible/tmp/ansible-tmp-1727203846.907523-15249-174267124054479 <<< 13271 1727203846.94086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203846.94097: stdout chunk (state=3): >>><<< 13271 1727203846.94112: stderr 
chunk (state=3): >>><<< 13271 1727203846.94280: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203846.907523-15249-174267124054479=/root/.ansible/tmp/ansible-tmp-1727203846.907523-15249-174267124054479 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203846.94284: variable 'ansible_module_compression' from source: unknown 13271 1727203846.94286: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 13271 1727203846.94305: variable 'ansible_facts' from source: unknown 13271 1727203846.94651: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203846.907523-15249-174267124054479/AnsiballZ_network_connections.py 13271 1727203846.94804: Sending initial data 13271 1727203846.94807: Sent 
initial data (167 bytes) 13271 1727203846.95460: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203846.95533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203846.95537: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203846.95617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203846.97391: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: 
Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203846.97478: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13271 1727203846.97566: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpp3kc1af_ /root/.ansible/tmp/ansible-tmp-1727203846.907523-15249-174267124054479/AnsiballZ_network_connections.py <<< 13271 1727203846.97569: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203846.907523-15249-174267124054479/AnsiballZ_network_connections.py" <<< 13271 1727203846.97656: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpp3kc1af_" to remote "/root/.ansible/tmp/ansible-tmp-1727203846.907523-15249-174267124054479/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203846.907523-15249-174267124054479/AnsiballZ_network_connections.py" <<< 13271 1727203846.98989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203846.99047: stderr chunk (state=3): >>><<< 13271 1727203846.99057: stdout chunk (state=3): >>><<< 13271 1727203846.99208: done transferring module to remote 13271 1727203846.99212: _low_level_execute_command(): starting 13271 1727203846.99215: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203846.907523-15249-174267124054479/ /root/.ansible/tmp/ansible-tmp-1727203846.907523-15249-174267124054479/AnsiballZ_network_connections.py && sleep 0' 13271 1727203846.99807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203846.99897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203846.99942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203846.99969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203847.00022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203847.00095: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203847.02180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203847.02219: stdout chunk (state=3): >>><<< 13271 1727203847.02222: stderr chunk (state=3): >>><<< 13271 1727203847.02239: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203847.02248: _low_level_execute_command(): starting 13271 1727203847.02281: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203846.907523-15249-174267124054479/AnsiballZ_network_connections.py && sleep 0' 13271 1727203847.02911: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203847.02936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203847.02949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203847.03047: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203847.03068: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203847.03091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203847.03286: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203847.58896: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 13271 1727203847.59046: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h0tmu23h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h0tmu23h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/f89efd2b-3089-4f6f-a8eb-220a10d50c35: error=unknown <<< 13271 1727203847.61007: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h0tmu23h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h0tmu23h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", 
line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/7bd864cb-338e-4797-a338-3ce206f3a7c2: error=unknown <<< 13271 1727203847.63109: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h0tmu23h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h0tmu23h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/37efd94b-f2e9-48ef-ae23-b04b9f9540cf: error=unknown <<< 13271 1727203847.63293: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 13271 
1727203847.65861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 13271 1727203847.65865: stdout chunk (state=3): >>><<< 13271 1727203847.65868: stderr chunk (state=3): >>><<< 13271 1727203847.65870: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h0tmu23h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h0tmu23h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/f89efd2b-3089-4f6f-a8eb-220a10d50c35: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h0tmu23h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h0tmu23h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/7bd864cb-338e-4797-a338-3ce206f3a7c2: error=unknown Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h0tmu23h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_h0tmu23h/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/37efd94b-f2e9-48ef-ae23-b04b9f9540cf: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 13271 1727203847.65881: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203846.907523-15249-174267124054479/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203847.65884: 
_low_level_execute_command(): starting 13271 1727203847.65886: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203846.907523-15249-174267124054479/ > /dev/null 2>&1 && sleep 0' 13271 1727203847.66942: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203847.67070: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203847.67243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203847.67267: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203847.67444: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203847.69507: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203847.69550: stderr chunk (state=3): >>><<< 13271 1727203847.69554: stdout chunk (state=3): >>><<< 13271 1727203847.69589: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203847.69594: handler run complete 13271 1727203847.69627: attempt loop complete, returning result 13271 1727203847.69631: _execute() done 13271 1727203847.69633: dumping result to json 13271 1727203847.69639: done dumping result, returning 13271 1727203847.69650: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [028d2410-947f-2a40-12ba-00000000008c] 13271 1727203847.69652: sending task result for task 028d2410-947f-2a40-12ba-00000000008c 13271 1727203847.69786: done sending task result for task 028d2410-947f-2a40-12ba-00000000008c 13271 1727203847.69789: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" 
}, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 13271 1727203847.70108: no more pending results, returning what we have 13271 1727203847.70111: results queue empty 13271 1727203847.70112: checking for any_errors_fatal 13271 1727203847.70118: done checking for any_errors_fatal 13271 1727203847.70119: checking for max_fail_percentage 13271 1727203847.70120: done checking for max_fail_percentage 13271 1727203847.70121: checking to see if all hosts have failed and the running result is not ok 13271 1727203847.70122: done checking to see if all hosts have failed 13271 1727203847.70123: getting the remaining hosts for this loop 13271 1727203847.70124: done getting the remaining hosts for this loop 13271 1727203847.70128: getting the next task for host managed-node1 13271 1727203847.70134: done getting next task for host managed-node1 13271 1727203847.70138: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 13271 1727203847.70141: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 13271 1727203847.70154: getting variables 13271 1727203847.70155: in VariableManager get_vars() 13271 1727203847.70238: Calling all_inventory to load vars for managed-node1 13271 1727203847.70241: Calling groups_inventory to load vars for managed-node1 13271 1727203847.70244: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203847.70262: Calling all_plugins_play to load vars for managed-node1 13271 1727203847.70266: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203847.70270: Calling groups_plugins_play to load vars for managed-node1 13271 1727203847.73311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203847.76034: done with get_vars() 13271 1727203847.76073: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:50:47 -0400 (0:00:01.049) 0:00:31.404 ***** 13271 1727203847.76164: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 13271 1727203847.76905: worker is 1 (out of 1 available) 13271 1727203847.76915: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 13271 1727203847.76924: done queuing things up, now waiting for results queue to drain 13271 1727203847.76926: waiting for pending results... 
13271 1727203847.77279: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 13271 1727203847.77331: in run() - task 028d2410-947f-2a40-12ba-00000000008d 13271 1727203847.77352: variable 'ansible_search_path' from source: unknown 13271 1727203847.77368: variable 'ansible_search_path' from source: unknown 13271 1727203847.77414: calling self._execute() 13271 1727203847.77520: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203847.77582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203847.77590: variable 'omit' from source: magic vars 13271 1727203847.77966: variable 'ansible_distribution_major_version' from source: facts 13271 1727203847.77985: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203847.78112: variable 'network_state' from source: role '' defaults 13271 1727203847.78135: Evaluated conditional (network_state != {}): False 13271 1727203847.78142: when evaluation is False, skipping this task 13271 1727203847.78148: _execute() done 13271 1727203847.78154: dumping result to json 13271 1727203847.78164: done dumping result, returning 13271 1727203847.78235: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [028d2410-947f-2a40-12ba-00000000008d] 13271 1727203847.78241: sending task result for task 028d2410-947f-2a40-12ba-00000000008d 13271 1727203847.78310: done sending task result for task 028d2410-947f-2a40-12ba-00000000008d 13271 1727203847.78313: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 13271 1727203847.78403: no more pending results, returning what we have 13271 1727203847.78407: results queue empty 13271 1727203847.78408: checking for any_errors_fatal 13271 1727203847.78423: done checking for any_errors_fatal 
13271 1727203847.78424: checking for max_fail_percentage 13271 1727203847.78426: done checking for max_fail_percentage 13271 1727203847.78427: checking to see if all hosts have failed and the running result is not ok 13271 1727203847.78428: done checking to see if all hosts have failed 13271 1727203847.78428: getting the remaining hosts for this loop 13271 1727203847.78430: done getting the remaining hosts for this loop 13271 1727203847.78433: getting the next task for host managed-node1 13271 1727203847.78441: done getting next task for host managed-node1 13271 1727203847.78481: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13271 1727203847.78486: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13271 1727203847.78508: getting variables 13271 1727203847.78510: in VariableManager get_vars() 13271 1727203847.78667: Calling all_inventory to load vars for managed-node1 13271 1727203847.78671: Calling groups_inventory to load vars for managed-node1 13271 1727203847.78675: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203847.78691: Calling all_plugins_play to load vars for managed-node1 13271 1727203847.78695: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203847.78698: Calling groups_plugins_play to load vars for managed-node1 13271 1727203847.81127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203847.83596: done with get_vars() 13271 1727203847.83623: done getting variables 13271 1727203847.83687: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:50:47 -0400 (0:00:00.075) 0:00:31.480 ***** 13271 1727203847.83720: entering _queue_task() for managed-node1/debug 13271 1727203847.84354: worker is 1 (out of 1 available) 13271 1727203847.84366: exiting _queue_task() for managed-node1/debug 13271 1727203847.84380: done queuing things up, now waiting for results queue to drain 13271 1727203847.84382: waiting for pending results... 
13271 1727203847.84685: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 13271 1727203847.84781: in run() - task 028d2410-947f-2a40-12ba-00000000008e 13271 1727203847.84786: variable 'ansible_search_path' from source: unknown 13271 1727203847.84789: variable 'ansible_search_path' from source: unknown 13271 1727203847.84937: calling self._execute() 13271 1727203847.84941: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203847.84948: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203847.84960: variable 'omit' from source: magic vars 13271 1727203847.85355: variable 'ansible_distribution_major_version' from source: facts 13271 1727203847.85370: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203847.85386: variable 'omit' from source: magic vars 13271 1727203847.85457: variable 'omit' from source: magic vars 13271 1727203847.85507: variable 'omit' from source: magic vars 13271 1727203847.85546: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203847.85586: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203847.85614: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203847.85633: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203847.85645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203847.85679: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203847.85682: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203847.85782: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node1' 13271 1727203847.85798: Set connection var ansible_connection to ssh 13271 1727203847.85811: Set connection var ansible_shell_type to sh 13271 1727203847.85819: Set connection var ansible_timeout to 10 13271 1727203847.85825: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203847.85831: Set connection var ansible_pipelining to False 13271 1727203847.85836: Set connection var ansible_shell_executable to /bin/sh 13271 1727203847.85864: variable 'ansible_shell_executable' from source: unknown 13271 1727203847.85868: variable 'ansible_connection' from source: unknown 13271 1727203847.85871: variable 'ansible_module_compression' from source: unknown 13271 1727203847.85873: variable 'ansible_shell_type' from source: unknown 13271 1727203847.85877: variable 'ansible_shell_executable' from source: unknown 13271 1727203847.85880: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203847.85882: variable 'ansible_pipelining' from source: unknown 13271 1727203847.85884: variable 'ansible_timeout' from source: unknown 13271 1727203847.85886: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203847.86087: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203847.86098: variable 'omit' from source: magic vars 13271 1727203847.86103: starting attempt loop 13271 1727203847.86106: running the handler 13271 1727203847.86467: variable '__network_connections_result' from source: set_fact 13271 1727203847.86669: handler run complete 13271 1727203847.86673: attempt loop complete, returning result 13271 1727203847.86679: _execute() done 13271 1727203847.86682: dumping result to json 13271 1727203847.86684: 
done dumping result, returning 13271 1727203847.86686: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [028d2410-947f-2a40-12ba-00000000008e] 13271 1727203847.86689: sending task result for task 028d2410-947f-2a40-12ba-00000000008e 13271 1727203847.86858: done sending task result for task 028d2410-947f-2a40-12ba-00000000008e 13271 1727203847.86865: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "" ] } 13271 1727203847.86959: no more pending results, returning what we have 13271 1727203847.86962: results queue empty 13271 1727203847.86963: checking for any_errors_fatal 13271 1727203847.86969: done checking for any_errors_fatal 13271 1727203847.86970: checking for max_fail_percentage 13271 1727203847.86972: done checking for max_fail_percentage 13271 1727203847.86973: checking to see if all hosts have failed and the running result is not ok 13271 1727203847.86974: done checking to see if all hosts have failed 13271 1727203847.86974: getting the remaining hosts for this loop 13271 1727203847.86978: done getting the remaining hosts for this loop 13271 1727203847.86982: getting the next task for host managed-node1 13271 1727203847.86989: done getting next task for host managed-node1 13271 1727203847.86993: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13271 1727203847.86997: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13271 1727203847.87009: getting variables 13271 1727203847.87011: in VariableManager get_vars() 13271 1727203847.87050: Calling all_inventory to load vars for managed-node1 13271 1727203847.87052: Calling groups_inventory to load vars for managed-node1 13271 1727203847.87055: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203847.87066: Calling all_plugins_play to load vars for managed-node1 13271 1727203847.87069: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203847.87072: Calling groups_plugins_play to load vars for managed-node1 13271 1727203847.89038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203847.91699: done with get_vars() 13271 1727203847.91737: done getting variables 13271 1727203847.91823: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:50:47 -0400 (0:00:00.082) 0:00:31.562 ***** 13271 1727203847.91970: entering _queue_task() for managed-node1/debug 13271 1727203847.92520: worker is 1 (out of 1 available) 13271 
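The "Show stderr messages" task logged above (task path `roles/network/tasks/main.yml:177`) is an ordinary `debug` action over the fact registered by the connection-profile step. A minimal sketch of such a task, assuming the variable name shown in the result (`__network_connections_result`):

```yaml
# Sketch of the debug task implied by the log output above; the
# exact task body in the role may differ.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines
```

Because `debug` runs entirely on the controller side of the action plugin, no module payload is shipped to the host, which is why no `_low_level_execute_command()` lines appear for this task in the log.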
1727203847.92532: exiting _queue_task() for managed-node1/debug 13271 1727203847.92544: done queuing things up, now waiting for results queue to drain 13271 1727203847.92545: waiting for pending results... 13271 1727203847.93119: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 13271 1727203847.93333: in run() - task 028d2410-947f-2a40-12ba-00000000008f 13271 1727203847.93348: variable 'ansible_search_path' from source: unknown 13271 1727203847.93352: variable 'ansible_search_path' from source: unknown 13271 1727203847.93405: calling self._execute() 13271 1727203847.93546: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203847.93550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203847.93554: variable 'omit' from source: magic vars 13271 1727203847.93883: variable 'ansible_distribution_major_version' from source: facts 13271 1727203847.93895: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203847.93902: variable 'omit' from source: magic vars 13271 1727203847.93966: variable 'omit' from source: magic vars 13271 1727203847.94007: variable 'omit' from source: magic vars 13271 1727203847.94043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203847.94076: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203847.94108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203847.94201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203847.94204: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203847.94207: variable 
'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203847.94210: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203847.94217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203847.94259: Set connection var ansible_connection to ssh 13271 1727203847.94265: Set connection var ansible_shell_type to sh 13271 1727203847.94274: Set connection var ansible_timeout to 10 13271 1727203847.94281: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203847.94287: Set connection var ansible_pipelining to False 13271 1727203847.94292: Set connection var ansible_shell_executable to /bin/sh 13271 1727203847.94323: variable 'ansible_shell_executable' from source: unknown 13271 1727203847.94329: variable 'ansible_connection' from source: unknown 13271 1727203847.94332: variable 'ansible_module_compression' from source: unknown 13271 1727203847.94334: variable 'ansible_shell_type' from source: unknown 13271 1727203847.94337: variable 'ansible_shell_executable' from source: unknown 13271 1727203847.94339: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203847.94341: variable 'ansible_pipelining' from source: unknown 13271 1727203847.94343: variable 'ansible_timeout' from source: unknown 13271 1727203847.94345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203847.94525: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203847.94529: variable 'omit' from source: magic vars 13271 1727203847.94531: starting attempt loop 13271 1727203847.94534: running the handler 13271 1727203847.94548: variable '__network_connections_result' from source: set_fact 13271 
1727203847.94618: variable '__network_connections_result' from source: set_fact 13271 1727203847.94747: handler run complete 13271 1727203847.94773: attempt loop complete, returning result 13271 1727203847.94849: _execute() done 13271 1727203847.94853: dumping result to json 13271 1727203847.94855: done dumping result, returning 13271 1727203847.94857: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [028d2410-947f-2a40-12ba-00000000008f] 13271 1727203847.94859: sending task result for task 028d2410-947f-2a40-12ba-00000000008f ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 13271 1727203847.95050: no more pending results, returning what we have 13271 1727203847.95054: results queue empty 13271 1727203847.95055: checking for any_errors_fatal 13271 1727203847.95059: done checking for any_errors_fatal 13271 1727203847.95177: checking for max_fail_percentage 13271 1727203847.95180: done checking for max_fail_percentage 13271 1727203847.95181: checking to see if all hosts have failed and the running result is not ok 13271 1727203847.95183: done checking to see if all hosts have failed 13271 1727203847.95183: getting the remaining hosts for this loop 13271 1727203847.95184: done getting the remaining hosts for this loop 13271 1727203847.95188: getting the next task for host managed-node1 13271 1727203847.95193: done getting next task for host managed-node1 13271 1727203847.95197: ^ task 
is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13271 1727203847.95201: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13271 1727203847.95213: getting variables 13271 1727203847.95214: in VariableManager get_vars() 13271 1727203847.95250: Calling all_inventory to load vars for managed-node1 13271 1727203847.95252: Calling groups_inventory to load vars for managed-node1 13271 1727203847.95254: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203847.95267: Calling all_plugins_play to load vars for managed-node1 13271 1727203847.95270: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203847.95274: Calling groups_plugins_play to load vars for managed-node1 13271 1727203847.95407: done sending task result for task 028d2410-947f-2a40-12ba-00000000008f 13271 1727203847.95424: WORKER PROCESS EXITING 13271 1727203847.97507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203847.99127: done with get_vars() 13271 1727203847.99157: done getting variables 13271 1727203847.99226: Loading ActionModule 'debug' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:50:47 -0400 (0:00:00.072) 0:00:31.635 ***** 13271 1727203847.99266: entering _queue_task() for managed-node1/debug 13271 1727203847.99664: worker is 1 (out of 1 available) 13271 1727203847.99808: exiting _queue_task() for managed-node1/debug 13271 1727203847.99818: done queuing things up, now waiting for results queue to drain 13271 1727203847.99820: waiting for pending results... 13271 1727203847.99998: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 13271 1727203848.00144: in run() - task 028d2410-947f-2a40-12ba-000000000090 13271 1727203848.00168: variable 'ansible_search_path' from source: unknown 13271 1727203848.00175: variable 'ansible_search_path' from source: unknown 13271 1727203848.00213: calling self._execute() 13271 1727203848.00317: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203848.00327: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203848.00339: variable 'omit' from source: magic vars 13271 1727203848.00726: variable 'ansible_distribution_major_version' from source: facts 13271 1727203848.00743: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203848.01029: variable 'network_state' from source: role '' defaults 13271 1727203848.01064: Evaluated conditional (network_state != {}): False 13271 1727203848.01159: when evaluation is False, skipping this task 13271 1727203848.01165: 
_execute() done 13271 1727203848.01167: dumping result to json 13271 1727203848.01169: done dumping result, returning 13271 1727203848.01172: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [028d2410-947f-2a40-12ba-000000000090] 13271 1727203848.01174: sending task result for task 028d2410-947f-2a40-12ba-000000000090 skipping: [managed-node1] => { "false_condition": "network_state != {}" } 13271 1727203848.01457: no more pending results, returning what we have 13271 1727203848.01464: results queue empty 13271 1727203848.01465: checking for any_errors_fatal 13271 1727203848.01474: done checking for any_errors_fatal 13271 1727203848.01477: checking for max_fail_percentage 13271 1727203848.01479: done checking for max_fail_percentage 13271 1727203848.01480: checking to see if all hosts have failed and the running result is not ok 13271 1727203848.01481: done checking to see if all hosts have failed 13271 1727203848.01482: getting the remaining hosts for this loop 13271 1727203848.01483: done getting the remaining hosts for this loop 13271 1727203848.01600: getting the next task for host managed-node1 13271 1727203848.01610: done getting next task for host managed-node1 13271 1727203848.01614: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 13271 1727203848.01620: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
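The `skipping:` result above shows the role's guard in action: `network_state` defaults to `{}` (the log notes it came "from source: role '' defaults"), so any task conditioned on it is skipped. A sketch of that guard pattern, under the assumption the role uses a plain `when:` clause as the `false_condition` field suggests:

```yaml
# Guard pattern matching the logged skip: with network_state left at
# its role default of {}, the conditional evaluates False and the
# task is skipped, exactly as recorded above.
- name: Show debug messages for the network_state
  debug:
    var: network_state
  when: network_state != {}
```

This is why the log contains paired "Evaluated conditional (network_state != {}): False" and "when evaluation is False, skipping this task" lines for every `network_state`-related task.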
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13271 1727203848.01641: getting variables 13271 1727203848.01643: in VariableManager get_vars() 13271 1727203848.01719: Calling all_inventory to load vars for managed-node1 13271 1727203848.01722: Calling groups_inventory to load vars for managed-node1 13271 1727203848.01724: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203848.01786: Calling all_plugins_play to load vars for managed-node1 13271 1727203848.01790: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203848.01793: Calling groups_plugins_play to load vars for managed-node1 13271 1727203848.02471: done sending task result for task 028d2410-947f-2a40-12ba-000000000090 13271 1727203848.02474: WORKER PROCESS EXITING 13271 1727203848.09283: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203848.10780: done with get_vars() 13271 1727203848.10805: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:50:48 -0400 (0:00:00.116) 0:00:31.752 ***** 13271 1727203848.10886: entering _queue_task() for managed-node1/ping 13271 1727203848.11224: worker is 1 (out of 1 available) 13271 1727203848.11237: exiting _queue_task() for managed-node1/ping 13271 1727203848.11248: done queuing things up, now waiting for results queue to drain 13271 1727203848.11249: waiting for pending results... 
13271 1727203848.11795: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 13271 1727203848.12206: in run() - task 028d2410-947f-2a40-12ba-000000000091 13271 1727203848.12293: variable 'ansible_search_path' from source: unknown 13271 1727203848.12302: variable 'ansible_search_path' from source: unknown 13271 1727203848.12572: calling self._execute() 13271 1727203848.12578: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203848.12696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203848.12711: variable 'omit' from source: magic vars 13271 1727203848.13541: variable 'ansible_distribution_major_version' from source: facts 13271 1727203848.13677: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203848.13692: variable 'omit' from source: magic vars 13271 1727203848.13835: variable 'omit' from source: magic vars 13271 1727203848.14082: variable 'omit' from source: magic vars 13271 1727203848.14086: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203848.14089: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203848.14188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203848.14220: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203848.14238: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203848.14279: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203848.14485: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203848.14488: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 13271 1727203848.14614: Set connection var ansible_connection to ssh 13271 1727203848.14684: Set connection var ansible_shell_type to sh 13271 1727203848.14710: Set connection var ansible_timeout to 10 13271 1727203848.14810: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203848.14813: Set connection var ansible_pipelining to False 13271 1727203848.14816: Set connection var ansible_shell_executable to /bin/sh 13271 1727203848.14884: variable 'ansible_shell_executable' from source: unknown 13271 1727203848.14893: variable 'ansible_connection' from source: unknown 13271 1727203848.14901: variable 'ansible_module_compression' from source: unknown 13271 1727203848.14908: variable 'ansible_shell_type' from source: unknown 13271 1727203848.14921: variable 'ansible_shell_executable' from source: unknown 13271 1727203848.14928: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203848.14937: variable 'ansible_pipelining' from source: unknown 13271 1727203848.15027: variable 'ansible_timeout' from source: unknown 13271 1727203848.15030: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203848.15792: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 13271 1727203848.15796: variable 'omit' from source: magic vars 13271 1727203848.15798: starting attempt loop 13271 1727203848.15800: running the handler 13271 1727203848.15802: _low_level_execute_command(): starting 13271 1727203848.15804: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203848.17329: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 
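The "Set connection var" lines above record the effective connection settings for the task. As a hypothetical illustration only (these values are Ansible defaults and were not necessarily set explicitly anywhere), an `ansible.cfg` expressing the non-SSH-specific ones could look like:

```ini
; Illustrative ansible.cfg matching the logged connection vars
; (ansible_timeout=10, ansible_pipelining=False). The remaining
; vars in the log (shell_type=sh, shell_executable=/bin/sh,
; module_compression=ZIP_DEFLATED) are stock defaults.
[defaults]
timeout = 10

[ssh_connection]
pipelining = False
```

The repeated `auto-mux: Trying existing master at '/root/.ansible/cp/...'` stderr chunks show each command reusing one multiplexed SSH ControlMaster connection rather than opening a fresh TCP session per task step.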
1727203848.17496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203848.17548: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203848.17894: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203848.18092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203848.20084: stdout chunk (state=3): >>>/root <<< 13271 1727203848.20088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203848.20091: stdout chunk (state=3): >>><<< 13271 1727203848.20093: stderr chunk (state=3): >>><<< 13271 1727203848.20097: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203848.20099: _low_level_execute_command(): starting 13271 1727203848.20101: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203848.2004526-15333-54102863466800 `" && echo ansible-tmp-1727203848.2004526-15333-54102863466800="` echo /root/.ansible/tmp/ansible-tmp-1727203848.2004526-15333-54102863466800 `" ) && sleep 0' 13271 1727203848.21337: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203848.21341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203848.21347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203848.21416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203848.21485: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203848.21554: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203848.21592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203848.23788: stdout chunk (state=3): >>>ansible-tmp-1727203848.2004526-15333-54102863466800=/root/.ansible/tmp/ansible-tmp-1727203848.2004526-15333-54102863466800 <<< 13271 1727203848.23993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203848.24264: stderr chunk (state=3): >>><<< 13271 1727203848.24268: stdout chunk (state=3): >>><<< 13271 1727203848.24271: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203848.2004526-15333-54102863466800=/root/.ansible/tmp/ansible-tmp-1727203848.2004526-15333-54102863466800 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203848.24273: variable 'ansible_module_compression' from source: unknown 13271 1727203848.24387: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 13271 1727203848.24465: variable 'ansible_facts' from source: unknown 13271 1727203848.24693: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203848.2004526-15333-54102863466800/AnsiballZ_ping.py 13271 1727203848.25010: Sending initial data 13271 1727203848.25019: Sent initial data (152 bytes) 13271 1727203848.26395: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203848.26466: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203848.26487: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203848.26502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203848.26610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203848.28374: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203848.28444: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203848.28530: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpm_alfdvq /root/.ansible/tmp/ansible-tmp-1727203848.2004526-15333-54102863466800/AnsiballZ_ping.py <<< 13271 1727203848.28534: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203848.2004526-15333-54102863466800/AnsiballZ_ping.py" <<< 13271 1727203848.28606: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpm_alfdvq" to remote "/root/.ansible/tmp/ansible-tmp-1727203848.2004526-15333-54102863466800/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203848.2004526-15333-54102863466800/AnsiballZ_ping.py" <<< 13271 1727203848.30238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203848.30300: stderr chunk (state=3): >>><<< 13271 1727203848.30344: stdout chunk (state=3): >>><<< 13271 1727203848.30370: done transferring module to remote 13271 1727203848.30395: _low_level_execute_command(): starting 13271 1727203848.30436: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203848.2004526-15333-54102863466800/ /root/.ansible/tmp/ansible-tmp-1727203848.2004526-15333-54102863466800/AnsiballZ_ping.py && sleep 0' 13271 1727203848.32171: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203848.32239: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203848.32314: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203848.32362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203848.32421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203848.32827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203848.34683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203848.34687: stdout chunk (state=3): >>><<< 13271 1727203848.34689: stderr chunk (state=3): >>><<< 13271 1727203848.34699: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203848.34702: _low_level_execute_command(): starting 13271 1727203848.34764: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203848.2004526-15333-54102863466800/AnsiballZ_ping.py && sleep 0' 13271 1727203848.36097: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203848.36112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203848.36284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203848.36797: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203848.36918: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 
1727203848.53390: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 13271 1727203848.55010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 13271 1727203848.55015: stdout chunk (state=3): >>><<< 13271 1727203848.55018: stderr chunk (state=3): >>><<< 13271 1727203848.55268: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
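The chunks above trace one complete module round-trip: Ansible creates a uniquely named remote temp directory, transfers `AnsiballZ_ping.py` into it over the multiplexed SSH session, marks it executable, runs it with the remote Python, and finally removes the directory. The following is a minimal local sketch of that same four-step `_low_level_execute_command()` sequence, with a plain `/bin/sh` subprocess standing in for the SSH connection and a one-line stand-in for the real AnsiballZ payload (the helper names `make_remote_tmp` and `run_module` are illustrative, not Ansible APIs):

```python
import json
import os
import random
import subprocess
import time
from pathlib import Path


def make_remote_tmp(base: str = "/tmp/demo-ansible") -> str:
    """Mimic the temp-dir naming visible in the log:
    ansible-tmp-<epoch with fraction>-<pid>-<random suffix>."""
    name = f"ansible-tmp-{time.time()}-{os.getpid()}-{random.randrange(10**14)}"
    path = f"{base}/{name}"
    # Same shape as the logged command:
    # ( umask 77 && mkdir -p "base" && mkdir "path" ) && sleep 0
    subprocess.run(
        ["/bin/sh", "-c", f'( umask 77 && mkdir -p "{base}" && mkdir "{path}" ) && sleep 0'],
        check=True,
    )
    return path


def run_module(tmpdir: str, module_source: str) -> str:
    """Transfer (here: a local write), chmod, execute, then clean up --
    the four _low_level_execute_command() steps shown in the log."""
    module = Path(tmpdir) / "AnsiballZ_ping.py"
    module.write_text(module_source)  # stands in for the sftp put
    subprocess.run(
        ["/bin/sh", "-c", f"chmod u+x {tmpdir}/ {module} && sleep 0"], check=True
    )
    result = subprocess.run(
        ["/bin/sh", "-c", f"/usr/bin/env python3 {module} && sleep 0"],
        capture_output=True, text=True, check=True,
    )
    subprocess.run(
        ["/bin/sh", "-c", f"rm -f -r {tmpdir}/ > /dev/null 2>&1 && sleep 0"], check=True
    )
    return result.stdout


# A stand-in payload that answers the way ansible.modules.ping does.
PING_STUB = 'import json; print(json.dumps({"ping": "pong"}))'
```

The real implementation additionally ZIP-packages the module (the `ANSIBALLZ: using cached module` line) and reuses the ControlMaster socket at `/root/.ansible/cp/...` so every step shares one TCP connection, which is why each step's stderr shows `mux_client_request_session` rather than a fresh key exchange.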
13271 1727203848.55272: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203848.2004526-15333-54102863466800/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203848.55278: _low_level_execute_command(): starting 13271 1727203848.55281: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203848.2004526-15333-54102863466800/ > /dev/null 2>&1 && sleep 0' 13271 1727203848.56250: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203848.56274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203848.56298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203848.56330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203848.56349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203848.56387: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203848.56459: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203848.56497: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203848.56501: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203848.56584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203848.58847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203848.58871: stderr chunk (state=3): >>><<< 13271 1727203848.58877: stdout chunk (state=3): >>><<< 13271 1727203848.58896: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203848.58981: handler run complete 13271 1727203848.58985: attempt loop complete, returning result 13271 1727203848.58987: _execute() done 13271 1727203848.58990: dumping result to json 13271 1727203848.58992: done dumping result, returning 13271 1727203848.59173: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [028d2410-947f-2a40-12ba-000000000091] 13271 1727203848.59180: sending task result for task 028d2410-947f-2a40-12ba-000000000091 13271 1727203848.59254: done sending task result for task 028d2410-947f-2a40-12ba-000000000091 13271 1727203848.59257: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 13271 1727203848.59347: no more pending results, returning what we have 13271 1727203848.59350: results queue empty 13271 1727203848.59352: checking for any_errors_fatal 13271 1727203848.59361: done checking for any_errors_fatal 13271 1727203848.59362: checking for max_fail_percentage 13271 1727203848.59363: done checking for max_fail_percentage 13271 1727203848.59365: checking to see if all hosts have failed and the running result is not ok 13271 1727203848.59366: done checking to see if all hosts have failed 13271 1727203848.59366: getting the remaining hosts for this loop 13271 1727203848.59368: done getting the remaining hosts for this loop 13271 1727203848.59371: getting the next task for host managed-node1 13271 1727203848.59385: done getting next task for host managed-node1 13271 1727203848.59388: ^ task is: TASK: meta (role_complete) 13271 1727203848.59392: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13271 1727203848.59407: getting variables 13271 1727203848.59409: in VariableManager get_vars() 13271 1727203848.59457: Calling all_inventory to load vars for managed-node1 13271 1727203848.59460: Calling groups_inventory to load vars for managed-node1 13271 1727203848.59462: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203848.59474: Calling all_plugins_play to load vars for managed-node1 13271 1727203848.59723: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203848.59728: Calling groups_plugins_play to load vars for managed-node1 13271 1727203848.62104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203848.63949: done with get_vars() 13271 1727203848.63970: done getting variables 13271 1727203848.64059: done queuing things up, now waiting for results queue to drain 13271 1727203848.64061: results queue empty 13271 1727203848.64062: checking for any_errors_fatal 13271 1727203848.64065: done checking for any_errors_fatal 13271 1727203848.64066: checking for max_fail_percentage 13271 1727203848.64067: done checking for max_fail_percentage 13271 1727203848.64067: checking to see if all hosts have failed and the running result is not ok 13271 1727203848.64068: done checking to see if all hosts have failed 13271 1727203848.64069: getting the 
remaining hosts for this loop 13271 1727203848.64070: done getting the remaining hosts for this loop 13271 1727203848.64073: getting the next task for host managed-node1 13271 1727203848.64079: done getting next task for host managed-node1 13271 1727203848.64081: ^ task is: TASK: Delete the device '{{ controller_device }}' 13271 1727203848.64083: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13271 1727203848.64086: getting variables 13271 1727203848.64087: in VariableManager get_vars() 13271 1727203848.64108: Calling all_inventory to load vars for managed-node1 13271 1727203848.64110: Calling groups_inventory to load vars for managed-node1 13271 1727203848.64112: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203848.64118: Calling all_plugins_play to load vars for managed-node1 13271 1727203848.64120: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203848.64122: Calling groups_plugins_play to load vars for managed-node1 13271 1727203848.66124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203848.67800: done with get_vars() 13271 1727203848.67829: done getting variables 13271 1727203848.67883: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 13271 1727203848.68019: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:114 Tuesday 24 September 2024 14:50:48 -0400 (0:00:00.571) 0:00:32.323 ***** 13271 1727203848.68052: entering _queue_task() for managed-node1/command 13271 1727203848.68489: worker is 1 (out of 1 available) 13271 1727203848.68504: exiting _queue_task() for managed-node1/command 13271 1727203848.68518: done queuing things up, now waiting for results queue to drain 13271 1727203848.68519: waiting for pending results... 13271 1727203848.69233: running TaskExecutor() for managed-node1/TASK: Delete the device 'nm-bond' 13271 1727203848.69329: in run() - task 028d2410-947f-2a40-12ba-0000000000c1 13271 1727203848.69334: variable 'ansible_search_path' from source: unknown 13271 1727203848.69520: calling self._execute() 13271 1727203848.69623: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203848.69630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203848.69642: variable 'omit' from source: magic vars 13271 1727203848.70259: variable 'ansible_distribution_major_version' from source: facts 13271 1727203848.70274: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203848.70281: variable 'omit' from source: magic vars 13271 1727203848.70307: variable 'omit' from source: magic vars 13271 1727203848.70414: variable 'controller_device' from source: play vars 13271 1727203848.70429: variable 'omit' from source: magic vars 13271 1727203848.70481: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 
1727203848.70520: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203848.70538: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203848.70556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203848.70581: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203848.70634: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203848.70637: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203848.70640: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203848.70718: Set connection var ansible_connection to ssh 13271 1727203848.70741: Set connection var ansible_shell_type to sh 13271 1727203848.70744: Set connection var ansible_timeout to 10 13271 1727203848.70746: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203848.70748: Set connection var ansible_pipelining to False 13271 1727203848.70750: Set connection var ansible_shell_executable to /bin/sh 13271 1727203848.70782: variable 'ansible_shell_executable' from source: unknown 13271 1727203848.70786: variable 'ansible_connection' from source: unknown 13271 1727203848.70789: variable 'ansible_module_compression' from source: unknown 13271 1727203848.70791: variable 'ansible_shell_type' from source: unknown 13271 1727203848.70793: variable 'ansible_shell_executable' from source: unknown 13271 1727203848.70796: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203848.70798: variable 'ansible_pipelining' from source: unknown 13271 1727203848.70800: variable 'ansible_timeout' from source: unknown 13271 1727203848.70802: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed-node1' 13271 1727203848.70960: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203848.70965: variable 'omit' from source: magic vars 13271 1727203848.70968: starting attempt loop 13271 1727203848.70971: running the handler 13271 1727203848.71069: _low_level_execute_command(): starting 13271 1727203848.71072: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203848.71807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203848.71974: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203848.72091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203848.72207: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13271 1727203848.73984: stdout chunk (state=3): >>>/root <<< 13271 1727203848.74182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203848.74185: stderr chunk (state=3): >>><<< 13271 1727203848.74188: stdout chunk (state=3): >>><<< 13271 1727203848.74191: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203848.74194: _low_level_execute_command(): starting 13271 1727203848.74331: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203848.741822-15418-256129017413872 `" && echo ansible-tmp-1727203848.741822-15418-256129017413872="` echo /root/.ansible/tmp/ansible-tmp-1727203848.741822-15418-256129017413872 `" 
) && sleep 0' 13271 1727203848.74905: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203848.74929: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203848.74947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203848.74968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203848.74995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203848.75049: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203848.75122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203848.75163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203848.75178: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203848.75295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203848.77420: stdout chunk (state=3): >>>ansible-tmp-1727203848.741822-15418-256129017413872=/root/.ansible/tmp/ansible-tmp-1727203848.741822-15418-256129017413872 <<< 13271 1727203848.77885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203848.77889: stdout chunk 
(state=3): >>><<< 13271 1727203848.77892: stderr chunk (state=3): >>><<< 13271 1727203848.77895: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203848.741822-15418-256129017413872=/root/.ansible/tmp/ansible-tmp-1727203848.741822-15418-256129017413872 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203848.77898: variable 'ansible_module_compression' from source: unknown 13271 1727203848.77900: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13271 1727203848.77930: variable 'ansible_facts' from source: unknown 13271 1727203848.78010: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203848.741822-15418-256129017413872/AnsiballZ_command.py 13271 1727203848.78198: Sending initial data 13271 1727203848.78207: Sent initial data (155 bytes) 
13271 1727203848.78721: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203848.78730: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203848.78740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203848.78753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203848.78766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203848.78771: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203848.78853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203848.78865: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203848.78955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203848.79096: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203848.80833: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203848.80917: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13271 1727203848.81005: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmp6y8vuv79 /root/.ansible/tmp/ansible-tmp-1727203848.741822-15418-256129017413872/AnsiballZ_command.py <<< 13271 1727203848.81018: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203848.741822-15418-256129017413872/AnsiballZ_command.py" <<< 13271 1727203848.81091: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmp6y8vuv79" to remote "/root/.ansible/tmp/ansible-tmp-1727203848.741822-15418-256129017413872/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203848.741822-15418-256129017413872/AnsiballZ_command.py" <<< 13271 1727203848.82280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203848.82283: stderr chunk (state=3): >>><<< 13271 1727203848.82285: stdout chunk (state=3): >>><<< 13271 1727203848.82325: done transferring module to remote 13271 1727203848.82336: _low_level_execute_command(): starting 13271 1727203848.82341: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203848.741822-15418-256129017413872/ /root/.ansible/tmp/ansible-tmp-1727203848.741822-15418-256129017413872/AnsiballZ_command.py && 
sleep 0' 13271 1727203848.82945: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203848.82954: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203848.82968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203848.82985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203848.83096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203848.83132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203848.83221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203848.85264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203848.85269: stdout chunk (state=3): >>><<< 13271 1727203848.85271: stderr chunk (state=3): >>><<< 13271 1727203848.85380: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203848.85384: _low_level_execute_command(): starting 13271 1727203848.85387: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203848.741822-15418-256129017413872/AnsiballZ_command.py && sleep 0' 13271 1727203848.85990: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203848.86036: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203848.86138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203849.03642: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:50:49.025507", "end": "2024-09-24 14:50:49.033872", "delta": "0:00:00.008365", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13271 1727203849.05411: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203849.05415: stdout chunk (state=3): >>><<< 13271 1727203849.05418: stderr chunk (state=3): >>><<< 13271 1727203849.05558: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:50:49.025507", "end": "2024-09-24 14:50:49.033872", "delta": "0:00:00.008365", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.14.47 closed. 
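[Editor's note: the `rc=1` above ("Cannot find device \"nm-bond\"") does not fail the play — the log below shows `Evaluated conditional (False): False` for both the `changed_when` and `failed_when` checks, and the task reports `ok` with `failed_when_result: false`. The playbook source is not included in this log, so the exact conditions are an assumption; a minimal sketch of a cleanup task that tolerates an already-absent device looks like:]

```yaml
# Hypothetical reconstruction -- the actual task file is not shown in this log.
- name: Delete the device 'nm-bond'
  ansible.builtin.command: ip link del nm-bond
  register: result
  # Treat "device does not exist" as success; any other non-zero rc still fails.
  failed_when: result.rc != 0 and "Cannot find device" not in result.stderr
  # Deleting (or finding nothing to delete) is cleanup, not a config change.
  changed_when: false
```

[This matches the result block that follows: `changed: false`, `rc: 1`, `failed_when_result: false`.]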
13271 1727203849.05563: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203848.741822-15418-256129017413872/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203849.05566: _low_level_execute_command(): starting 13271 1727203849.05569: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203848.741822-15418-256129017413872/ > /dev/null 2>&1 && sleep 0' 13271 1727203849.06105: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203849.06119: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203849.06134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203849.06160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203849.06189: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 13271 1727203849.06199: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 13271 1727203849.06272: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203849.06471: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203849.06595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203849.08707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203849.08732: stdout chunk (state=3): >>><<< 13271 1727203849.08749: stderr chunk (state=3): >>><<< 13271 1727203849.08771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203849.08786: handler run complete 13271 1727203849.08812: Evaluated conditional (False): False 13271 1727203849.08831: Evaluated conditional (False): False 13271 1727203849.08848: attempt loop complete, returning result 13271 1727203849.08860: _execute() done 13271 1727203849.08869: dumping result to json 13271 1727203849.08880: done dumping result, returning 13271 1727203849.08892: done running TaskExecutor() for managed-node1/TASK: Delete the device 'nm-bond' [028d2410-947f-2a40-12ba-0000000000c1] 13271 1727203849.08899: sending task result for task 028d2410-947f-2a40-12ba-0000000000c1 ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.008365", "end": "2024-09-24 14:50:49.033872", "failed_when_result": false, "rc": 1, "start": "2024-09-24 14:50:49.025507" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 13271 1727203849.09298: no more pending results, returning what we have 13271 1727203849.09301: results queue empty 13271 1727203849.09303: checking for any_errors_fatal 13271 1727203849.09304: done checking for any_errors_fatal 13271 1727203849.09305: checking for max_fail_percentage 13271 1727203849.09307: done checking for max_fail_percentage 13271 1727203849.09308: checking to see if all hosts have failed and the running result is not ok 13271 1727203849.09309: done checking to see if all hosts have failed 13271 1727203849.09310: getting the remaining hosts for this loop 13271 1727203849.09312: done getting the remaining hosts for this loop 13271 1727203849.09316: getting the next task for host managed-node1 13271 1727203849.09323: done getting next task for host managed-node1 13271 1727203849.09326: ^ task is: TASK: Remove test interfaces 13271 1727203849.09329: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, 
run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13271 1727203849.09334: getting variables 13271 1727203849.09336: in VariableManager get_vars() 13271 1727203849.09387: Calling all_inventory to load vars for managed-node1 13271 1727203849.09390: Calling groups_inventory to load vars for managed-node1 13271 1727203849.09393: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203849.09401: done sending task result for task 028d2410-947f-2a40-12ba-0000000000c1 13271 1727203849.09404: WORKER PROCESS EXITING 13271 1727203849.09415: Calling all_plugins_play to load vars for managed-node1 13271 1727203849.09419: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203849.09423: Calling groups_plugins_play to load vars for managed-node1 13271 1727203849.11491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203849.13486: done with get_vars() 13271 1727203849.13513: done getting variables 13271 1727203849.13582: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:50:49 -0400 (0:00:00.455) 0:00:32.779 ***** 13271 1727203849.13620: entering _queue_task() for managed-node1/shell 13271 1727203849.14198: worker is 1 (out of 1 available) 13271 1727203849.14207: exiting _queue_task() for managed-node1/shell 13271 1727203849.14219: done queuing things up, now waiting for results queue to drain 13271 1727203849.14221: waiting for pending results... 13271 1727203849.14324: running TaskExecutor() for managed-node1/TASK: Remove test interfaces 13271 1727203849.14492: in run() - task 028d2410-947f-2a40-12ba-0000000000c5 13271 1727203849.14514: variable 'ansible_search_path' from source: unknown 13271 1727203849.14523: variable 'ansible_search_path' from source: unknown 13271 1727203849.14575: calling self._execute() 13271 1727203849.14688: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203849.14703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203849.14720: variable 'omit' from source: magic vars 13271 1727203849.15108: variable 'ansible_distribution_major_version' from source: facts 13271 1727203849.15127: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203849.15140: variable 'omit' from source: magic vars 13271 1727203849.15207: variable 'omit' from source: magic vars 13271 1727203849.15371: variable 'dhcp_interface1' from source: play vars 13271 1727203849.15383: variable 'dhcp_interface2' from source: play vars 13271 1727203849.15403: variable 'omit' from source: magic vars 13271 1727203849.15448: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203849.15490: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203849.15512: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203849.15537: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203849.15645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203849.15649: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203849.15651: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203849.15653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203849.15691: Set connection var ansible_connection to ssh 13271 1727203849.15703: Set connection var ansible_shell_type to sh 13271 1727203849.15714: Set connection var ansible_timeout to 10 13271 1727203849.15722: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203849.15729: Set connection var ansible_pipelining to False 13271 1727203849.15736: Set connection var ansible_shell_executable to /bin/sh 13271 1727203849.15769: variable 'ansible_shell_executable' from source: unknown 13271 1727203849.15777: variable 'ansible_connection' from source: unknown 13271 1727203849.15783: variable 'ansible_module_compression' from source: unknown 13271 1727203849.15788: variable 'ansible_shell_type' from source: unknown 13271 1727203849.15793: variable 'ansible_shell_executable' from source: unknown 13271 1727203849.15798: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203849.15805: variable 'ansible_pipelining' from source: unknown 13271 1727203849.15811: variable 'ansible_timeout' 
from source: unknown 13271 1727203849.15819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203849.15947: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203849.15972: variable 'omit' from source: magic vars 13271 1727203849.15990: starting attempt loop 13271 1727203849.15998: running the handler 13271 1727203849.16012: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203849.16083: _low_level_execute_command(): starting 13271 1727203849.16086: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203849.16790: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203849.16804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203849.16821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203849.16940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203849.16951: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203849.16978: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203849.17104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203849.18911: stdout chunk (state=3): >>>/root <<< 13271 1727203849.19070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203849.19073: stdout chunk (state=3): >>><<< 13271 1727203849.19078: stderr chunk (state=3): >>><<< 13271 1727203849.19102: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203849.19124: _low_level_execute_command(): starting 13271 1727203849.19137: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203849.1910915-15447-173308573363400 `" && echo ansible-tmp-1727203849.1910915-15447-173308573363400="` echo /root/.ansible/tmp/ansible-tmp-1727203849.1910915-15447-173308573363400 `" ) && sleep 0' 13271 1727203849.19779: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203849.19800: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203849.19812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203849.19834: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203849.19849: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203849.19890: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203849.19960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203849.19987: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203849.20018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203849.20100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203849.22257: stdout chunk (state=3): >>>ansible-tmp-1727203849.1910915-15447-173308573363400=/root/.ansible/tmp/ansible-tmp-1727203849.1910915-15447-173308573363400 <<< 13271 1727203849.22396: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203849.22420: stderr chunk (state=3): >>><<< 13271 1727203849.22444: stdout chunk (state=3): >>><<< 13271 1727203849.22681: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203849.1910915-15447-173308573363400=/root/.ansible/tmp/ansible-tmp-1727203849.1910915-15447-173308573363400 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
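The `( umask 77 && mkdir -p ... )` command the log just executed is Ansible's standard way of creating a private per-task temp directory on the remote host. A minimal sketch of the same pattern (paths and the `-demo` suffix are illustrative, not the exact ones from this run): `umask 77` inside the subshell makes `mkdir` create the directories with mode 700, and the final `echo name=path` line is what the controller parses to learn the created path.

```shell
# Private temp-dir pattern: umask 77 => directories created rwx------ (700).
tmp_root="${TMPDIR:-/tmp}/ansible-demo-tmp"
stamp="ansible-tmp-$(date +%s)-$$-demo"
( umask 77 && mkdir -p "$tmp_root" && mkdir "$tmp_root/$stamp" \
  && echo "$stamp=$tmp_root/$stamp" ) && sleep 0
```

The trailing `&& sleep 0` mirrors the log's command; Ansible appends it so the remote shell flushes output before the channel closes.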
Received exit status from master 0 13271 1727203849.22684: variable 'ansible_module_compression' from source: unknown 13271 1727203849.22687: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13271 1727203849.22689: variable 'ansible_facts' from source: unknown 13271 1727203849.22691: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203849.1910915-15447-173308573363400/AnsiballZ_command.py 13271 1727203849.22942: Sending initial data 13271 1727203849.22946: Sent initial data (156 bytes) 13271 1727203849.23531: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203849.23547: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203849.23568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203849.23688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203849.23710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203849.23820: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13271 1727203849.25648: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13271 1727203849.25675: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203849.25749: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 13271 1727203849.25844: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmprjk999u4 /root/.ansible/tmp/ansible-tmp-1727203849.1910915-15447-173308573363400/AnsiballZ_command.py <<< 13271 1727203849.25847: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203849.1910915-15447-173308573363400/AnsiballZ_command.py" <<< 13271 1727203849.25908: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmprjk999u4" to remote "/root/.ansible/tmp/ansible-tmp-1727203849.1910915-15447-173308573363400/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203849.1910915-15447-173308573363400/AnsiballZ_command.py" <<< 13271 1727203849.26993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203849.26996: stdout chunk (state=3): >>><<< 13271 1727203849.26999: stderr chunk (state=3): >>><<< 13271 1727203849.27001: done transferring module to remote 13271 1727203849.27003: _low_level_execute_command(): starting 13271 1727203849.27006: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203849.1910915-15447-173308573363400/ /root/.ansible/tmp/ansible-tmp-1727203849.1910915-15447-173308573363400/AnsiballZ_command.py && sleep 0' 13271 1727203849.27614: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203849.27630: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203849.27653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203849.27753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203849.27785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203849.27799: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203849.27907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203849.30081: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203849.30085: stdout chunk (state=3): >>><<< 13271 1727203849.30087: stderr chunk (state=3): >>><<< 13271 1727203849.30090: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203849.30092: _low_level_execute_command(): starting 13271 1727203849.30095: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203849.1910915-15447-173308573363400/AnsiballZ_command.py && sleep 0' 13271 1727203849.30759: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203849.30774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203849.30798: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203849.30921: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 13271 1727203849.51101: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:50:49.471515", "end": "2024-09-24 14:50:49.508727", "delta": "0:00:00.037212", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13271 1727203849.53147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203849.53161: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203849.53189: stdout chunk (state=3): >>><<< 13271 1727203849.53193: stderr chunk (state=3): >>><<< 13271 1727203849.53342: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:50:49.471515", "end": "2024-09-24 14:50:49.508727", "delta": "0:00:00.037212", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data 
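The cleanup script in the module result above relies on the `cmd || rc="$?"` idiom: under `set -e` a bare failing `ip link delete` would abort the script at the first missing interface, but capturing the status on the `||` branch lets every deletion be attempted and the failure reported instead. A reduced sketch of the same pattern with a harmless command in place of `ip link delete` (the `attempt` helper is hypothetical, introduced only for this illustration):

```shell
set -euo pipefail

# Run a command without letting set -e abort the script on failure;
# record the exit code and report it, exactly like the task's
# `ip link delete testX || rc="$?"` blocks.
attempt() {
    rc=0
    "$@" || rc="$?"
    if [ "$rc" != 0 ]; then
        echo "ERROR - command failed with rc=$rc"
    fi
}

attempt true            # succeeds, prints nothing
attempt false           # fails, but the script keeps going
echo "reached the end"  # proves set -e was never triggered
```

Because the `if` statement is the last command in `attempt`, the function itself always returns 0, which is also why the task above reports `rc: 0` even though it would print ERROR lines for links it could not delete.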
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 13271 1727203849.53346: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203849.1910915-15447-173308573363400/', '_ansible_remote_tmp': '~/.ansible/tmp', 
'_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203849.53355: _low_level_execute_command(): starting 13271 1727203849.53358: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203849.1910915-15447-173308573363400/ > /dev/null 2>&1 && sleep 0' 13271 1727203849.53925: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203849.53936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203849.53948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203849.53988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203849.54062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203849.54080: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203849.54105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203849.54224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203849.56481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 
1727203849.56485: stdout chunk (state=3): >>><<< 13271 1727203849.56486: stderr chunk (state=3): >>><<< 13271 1727203849.56488: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203849.56490: handler run complete 13271 1727203849.56492: Evaluated conditional (False): False 13271 1727203849.56493: attempt loop complete, returning result 13271 1727203849.56495: _execute() done 13271 1727203849.56496: dumping result to json 13271 1727203849.56498: done dumping result, returning 13271 1727203849.56499: done running TaskExecutor() for managed-node1/TASK: Remove test interfaces [028d2410-947f-2a40-12ba-0000000000c5] 13271 1727203849.56501: sending task result for task 028d2410-947f-2a40-12ba-0000000000c5 13271 1727203849.56566: done sending task result for task 028d2410-947f-2a40-12ba-0000000000c5 13271 
1727203849.56570: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.037212", "end": "2024-09-24 14:50:49.508727", "rc": 0, "start": "2024-09-24 14:50:49.471515" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 13271 1727203849.56644: no more pending results, returning what we have 13271 1727203849.56648: results queue empty 13271 1727203849.56649: checking for any_errors_fatal 13271 1727203849.56663: done checking for any_errors_fatal 13271 1727203849.56664: checking for max_fail_percentage 13271 1727203849.56666: done checking for max_fail_percentage 13271 1727203849.56667: checking to see if all hosts have failed and the running result is not ok 13271 1727203849.56669: done checking to see if all hosts have failed 13271 1727203849.56669: getting the remaining hosts for this loop 13271 1727203849.56671: done getting the remaining hosts for this loop 13271 1727203849.56785: getting the next task for host managed-node1 13271 1727203849.56794: done getting next task for host managed-node1 13271 1727203849.56797: ^ task is: TASK: Stop dnsmasq/radvd services 13271 1727203849.56801: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 13271 1727203849.56805: getting variables 13271 1727203849.56807: in VariableManager get_vars() 13271 1727203849.56845: Calling all_inventory to load vars for managed-node1 13271 1727203849.56848: Calling groups_inventory to load vars for managed-node1 13271 1727203849.56851: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203849.56865: Calling all_plugins_play to load vars for managed-node1 13271 1727203849.56869: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203849.56872: Calling groups_plugins_play to load vars for managed-node1 13271 1727203849.58440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203849.60055: done with get_vars() 13271 1727203849.60083: done getting variables 13271 1727203849.60143: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Tuesday 24 
September 2024 14:50:49 -0400 (0:00:00.465) 0:00:33.245 ***** 13271 1727203849.60189: entering _queue_task() for managed-node1/shell 13271 1727203849.60538: worker is 1 (out of 1 available) 13271 1727203849.60551: exiting _queue_task() for managed-node1/shell 13271 1727203849.60566: done queuing things up, now waiting for results queue to drain 13271 1727203849.60568: waiting for pending results... 13271 1727203849.61021: running TaskExecutor() for managed-node1/TASK: Stop dnsmasq/radvd services 13271 1727203849.61026: in run() - task 028d2410-947f-2a40-12ba-0000000000c6 13271 1727203849.61029: variable 'ansible_search_path' from source: unknown 13271 1727203849.61032: variable 'ansible_search_path' from source: unknown 13271 1727203849.61052: calling self._execute() 13271 1727203849.61151: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203849.61154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203849.61168: variable 'omit' from source: magic vars 13271 1727203849.61881: variable 'ansible_distribution_major_version' from source: facts 13271 1727203849.61895: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203849.61902: variable 'omit' from source: magic vars 13271 1727203849.61952: variable 'omit' from source: magic vars 13271 1727203849.62120: variable 'omit' from source: magic vars 13271 1727203849.62155: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203849.62192: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203849.62211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203849.62298: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203849.62309: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203849.62348: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203849.62352: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203849.62354: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203849.62461: Set connection var ansible_connection to ssh 13271 1727203849.62472: Set connection var ansible_shell_type to sh 13271 1727203849.62481: Set connection var ansible_timeout to 10 13271 1727203849.62487: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203849.62492: Set connection var ansible_pipelining to False 13271 1727203849.62498: Set connection var ansible_shell_executable to /bin/sh 13271 1727203849.62524: variable 'ansible_shell_executable' from source: unknown 13271 1727203849.62527: variable 'ansible_connection' from source: unknown 13271 1727203849.62530: variable 'ansible_module_compression' from source: unknown 13271 1727203849.62533: variable 'ansible_shell_type' from source: unknown 13271 1727203849.62535: variable 'ansible_shell_executable' from source: unknown 13271 1727203849.62537: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203849.62780: variable 'ansible_pipelining' from source: unknown 13271 1727203849.62784: variable 'ansible_timeout' from source: unknown 13271 1727203849.62787: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203849.62790: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203849.62793: variable 'omit' from source: magic vars 13271 1727203849.62795: starting attempt 
loop 13271 1727203849.62797: running the handler 13271 1727203849.62800: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203849.62803: _low_level_execute_command(): starting 13271 1727203849.62805: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203849.63563: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203849.63580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203849.63674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203849.63701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203849.63713: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203849.63732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203849.63840: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 13271 1727203849.65668: stdout chunk (state=3): >>>/root <<< 13271 1727203849.65833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203849.65837: stdout chunk (state=3): >>><<< 13271 1727203849.65839: stderr chunk (state=3): >>><<< 13271 1727203849.65866: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203849.65986: _low_level_execute_command(): starting 13271 1727203849.65992: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203849.6588168-15505-256878664925755 `" && echo ansible-tmp-1727203849.6588168-15505-256878664925755="` echo /root/.ansible/tmp/ansible-tmp-1727203849.6588168-15505-256878664925755 
`" ) && sleep 0' 13271 1727203849.66559: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203849.66565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203849.66644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203849.66657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203849.66728: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203849.66756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203849.66783: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203849.66969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203849.69319: stdout chunk (state=3): >>>ansible-tmp-1727203849.6588168-15505-256878664925755=/root/.ansible/tmp/ansible-tmp-1727203849.6588168-15505-256878664925755 <<< 13271 1727203849.69323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203849.69326: stdout chunk (state=3): >>><<< 13271 1727203849.69328: stderr chunk (state=3): >>><<< 13271 1727203849.69335: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203849.6588168-15505-256878664925755=/root/.ansible/tmp/ansible-tmp-1727203849.6588168-15505-256878664925755 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203849.69441: variable 'ansible_module_compression' from source: unknown 13271 1727203849.69538: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13271 1727203849.69716: variable 'ansible_facts' from source: unknown 13271 1727203849.69837: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203849.6588168-15505-256878664925755/AnsiballZ_command.py 13271 1727203849.70068: Sending initial data 13271 1727203849.70074: Sent initial data (156 bytes) 13271 1727203849.70672: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 
13271 1727203849.70736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203849.70749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203849.70814: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203849.70830: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203849.70852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203849.70962: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203849.72797: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203849.72814: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13271 1727203849.72893: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmpcacz8xxi /root/.ansible/tmp/ansible-tmp-1727203849.6588168-15505-256878664925755/AnsiballZ_command.py <<< 13271 1727203849.72897: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203849.6588168-15505-256878664925755/AnsiballZ_command.py" <<< 13271 1727203849.72977: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmpcacz8xxi" to remote "/root/.ansible/tmp/ansible-tmp-1727203849.6588168-15505-256878664925755/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203849.6588168-15505-256878664925755/AnsiballZ_command.py" <<< 13271 1727203849.74024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203849.74028: stdout chunk (state=3): >>><<< 13271 1727203849.74030: stderr chunk (state=3): >>><<< 13271 1727203849.74032: done transferring module to remote 13271 1727203849.74034: _low_level_execute_command(): starting 13271 1727203849.74037: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203849.6588168-15505-256878664925755/ /root/.ansible/tmp/ansible-tmp-1727203849.6588168-15505-256878664925755/AnsiballZ_command.py && sleep 0' 13271 1727203849.74695: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203849.74709: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203849.74721: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203849.74783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203849.74854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203849.74892: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203849.74911: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203849.74957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203849.75047: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203849.77079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203849.77083: stdout chunk (state=3): >>><<< 13271 1727203849.77085: stderr chunk (state=3): >>><<< 13271 1727203849.77185: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203849.77188: _low_level_execute_command(): starting 13271 1727203849.77191: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203849.6588168-15505-256878664925755/AnsiballZ_command.py && sleep 0' 13271 1727203849.78025: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203849.78029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203849.78031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203849.78034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203849.78046: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203849.78180: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203849.78184: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203849.78186: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 
1727203849.78189: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203849.78193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203849.78195: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203849.78197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203849.78303: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203849.97818: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:50:49.947069", "end": "2024-09-24 14:50:49.974969", "delta": "0:00:00.027900", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf 
/run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13271 1727203849.99596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. <<< 13271 1727203849.99600: stdout chunk (state=3): >>><<< 13271 1727203849.99602: stderr chunk (state=3): >>><<< 13271 1727203849.99625: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:50:49.947069", "end": "2024-09-24 14:50:49.974969", "delta": "0:00:00.027900", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n 
# Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
13271 1727203849.99769: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203849.6588168-15505-256878664925755/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203849.99773: _low_level_execute_command(): starting 13271 1727203849.99779: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203849.6588168-15505-256878664925755/ > /dev/null 2>&1 && sleep 0' 13271 1727203850.00390: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203850.00404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203850.00418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203850.00447: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 
1727203850.00469: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203850.00559: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203850.00598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203850.00616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203850.00639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203850.00773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203850.02920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203850.03125: stderr chunk (state=3): >>><<< 13271 1727203850.03128: stdout chunk (state=3): >>><<< 13271 1727203850.03131: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203850.03133: handler run complete 13271 1727203850.03135: Evaluated conditional (False): False 13271 1727203850.03137: attempt loop complete, returning result 13271 1727203850.03139: _execute() done 13271 1727203850.03140: dumping result to json 13271 1727203850.03142: done dumping result, returning 13271 1727203850.03143: done running TaskExecutor() for managed-node1/TASK: Stop dnsmasq/radvd services [028d2410-947f-2a40-12ba-0000000000c6] 13271 1727203850.03145: sending task result for task 028d2410-947f-2a40-12ba-0000000000c6 ok: [managed-node1] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.027900", "end": "2024-09-24 14:50:49.974969", "rc": 0, "start": "2024-09-24 14:50:49.947069" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + 
grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 13271 1727203850.03477: no more pending results, returning what we have 13271 1727203850.03480: results queue empty 13271 1727203850.03481: checking for any_errors_fatal 13271 1727203850.03491: done checking for any_errors_fatal 13271 1727203850.03492: checking for max_fail_percentage 13271 1727203850.03494: done checking for max_fail_percentage 13271 1727203850.03495: checking to see if all hosts have failed and the running result is not ok 13271 1727203850.03496: done checking to see if all hosts have failed 13271 1727203850.03496: getting the remaining hosts for this loop 13271 1727203850.03498: done getting the remaining hosts for this loop 13271 1727203850.03501: getting the next task for host managed-node1 13271 1727203850.03509: done getting next task for host managed-node1 13271 1727203850.03512: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 13271 1727203850.03515: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13271 1727203850.03518: getting variables 13271 1727203850.03520: in VariableManager get_vars() 13271 1727203850.03690: Calling all_inventory to load vars for managed-node1 13271 1727203850.03693: Calling groups_inventory to load vars for managed-node1 13271 1727203850.03695: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203850.03788: Calling all_plugins_play to load vars for managed-node1 13271 1727203850.03792: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203850.03800: Calling groups_plugins_play to load vars for managed-node1 13271 1727203850.04603: done sending task result for task 028d2410-947f-2a40-12ba-0000000000c6 13271 1727203850.04607: WORKER PROCESS EXITING 13271 1727203850.05763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203850.07396: done with get_vars() 13271 1727203850.07419: done getting variables 13271 1727203850.07492: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:120 Tuesday 24 September 2024 14:50:50 -0400 (0:00:00.473) 0:00:33.718 ***** 13271 1727203850.07523: entering _queue_task() for managed-node1/command 13271 1727203850.07911: worker is 1 (out of 1 available) 13271 1727203850.07925: exiting _queue_task() for managed-node1/command 13271 1727203850.07938: done queuing things up, now waiting for results queue to drain 13271 1727203850.07939: waiting for pending results... 
13271 1727203850.08394: running TaskExecutor() for managed-node1/TASK: Restore the /etc/resolv.conf for initscript 13271 1727203850.08399: in run() - task 028d2410-947f-2a40-12ba-0000000000c7 13271 1727203850.08414: variable 'ansible_search_path' from source: unknown 13271 1727203850.08448: calling self._execute() 13271 1727203850.08549: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203850.08553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203850.08567: variable 'omit' from source: magic vars 13271 1727203850.08941: variable 'ansible_distribution_major_version' from source: facts 13271 1727203850.08953: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203850.09071: variable 'network_provider' from source: set_fact 13271 1727203850.09076: Evaluated conditional (network_provider == "initscripts"): False 13271 1727203850.09079: when evaluation is False, skipping this task 13271 1727203850.09082: _execute() done 13271 1727203850.09085: dumping result to json 13271 1727203850.09091: done dumping result, returning 13271 1727203850.09094: done running TaskExecutor() for managed-node1/TASK: Restore the /etc/resolv.conf for initscript [028d2410-947f-2a40-12ba-0000000000c7] 13271 1727203850.09101: sending task result for task 028d2410-947f-2a40-12ba-0000000000c7 skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 13271 1727203850.09234: no more pending results, returning what we have 13271 1727203850.09238: results queue empty 13271 1727203850.09239: checking for any_errors_fatal 13271 1727203850.09250: done checking for any_errors_fatal 13271 1727203850.09250: checking for max_fail_percentage 13271 1727203850.09252: done checking for max_fail_percentage 13271 1727203850.09253: checking to see if all hosts have failed and the running result is not ok 13271 
1727203850.09254: done checking to see if all hosts have failed 13271 1727203850.09254: getting the remaining hosts for this loop 13271 1727203850.09256: done getting the remaining hosts for this loop 13271 1727203850.09258: getting the next task for host managed-node1 13271 1727203850.09266: done getting next task for host managed-node1 13271 1727203850.09270: ^ task is: TASK: Verify network state restored to default 13271 1727203850.09273: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13271 1727203850.09278: getting variables 13271 1727203850.09280: in VariableManager get_vars() 13271 1727203850.09320: Calling all_inventory to load vars for managed-node1 13271 1727203850.09323: Calling groups_inventory to load vars for managed-node1 13271 1727203850.09325: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203850.09337: Calling all_plugins_play to load vars for managed-node1 13271 1727203850.09339: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203850.09342: Calling groups_plugins_play to load vars for managed-node1 13271 1727203850.09889: done sending task result for task 028d2410-947f-2a40-12ba-0000000000c7 13271 1727203850.09892: WORKER PROCESS EXITING 13271 1727203850.10849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203850.12474: done with get_vars() 13271 1727203850.12503: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:125 Tuesday 24 September 2024 14:50:50 -0400 (0:00:00.050) 0:00:33.769 ***** 13271 1727203850.12597: entering _queue_task() for managed-node1/include_tasks 13271 1727203850.12956: worker is 1 (out of 1 available) 13271 1727203850.12974: exiting _queue_task() for managed-node1/include_tasks 13271 1727203850.12989: done queuing things up, now waiting for results queue to drain 13271 1727203850.12991: waiting for pending results... 
13271 1727203850.13255: running TaskExecutor() for managed-node1/TASK: Verify network state restored to default 13271 1727203850.13392: in run() - task 028d2410-947f-2a40-12ba-0000000000c8 13271 1727203850.13482: variable 'ansible_search_path' from source: unknown 13271 1727203850.13486: calling self._execute() 13271 1727203850.13546: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203850.13559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203850.13574: variable 'omit' from source: magic vars 13271 1727203850.13963: variable 'ansible_distribution_major_version' from source: facts 13271 1727203850.13980: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203850.13989: _execute() done 13271 1727203850.13995: dumping result to json 13271 1727203850.14001: done dumping result, returning 13271 1727203850.14007: done running TaskExecutor() for managed-node1/TASK: Verify network state restored to default [028d2410-947f-2a40-12ba-0000000000c8] 13271 1727203850.14015: sending task result for task 028d2410-947f-2a40-12ba-0000000000c8 13271 1727203850.14262: done sending task result for task 028d2410-947f-2a40-12ba-0000000000c8 13271 1727203850.14265: WORKER PROCESS EXITING 13271 1727203850.14304: no more pending results, returning what we have 13271 1727203850.14309: in VariableManager get_vars() 13271 1727203850.14353: Calling all_inventory to load vars for managed-node1 13271 1727203850.14355: Calling groups_inventory to load vars for managed-node1 13271 1727203850.14357: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203850.14371: Calling all_plugins_play to load vars for managed-node1 13271 1727203850.14373: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203850.14377: Calling groups_plugins_play to load vars for managed-node1 13271 1727203850.16041: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203850.17620: done with get_vars() 13271 1727203850.17638: variable 'ansible_search_path' from source: unknown 13271 1727203850.17654: we have included files to process 13271 1727203850.17655: generating all_blocks data 13271 1727203850.17657: done generating all_blocks data 13271 1727203850.17663: processing included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13271 1727203850.17664: loading included file: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13271 1727203850.17666: Loading data from /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 13271 1727203850.18067: done processing included file 13271 1727203850.18070: iterating over new_blocks loaded from include file 13271 1727203850.18071: in VariableManager get_vars() 13271 1727203850.18097: done with get_vars() 13271 1727203850.18098: filtering new block on tags 13271 1727203850.18136: done filtering new block on tags 13271 1727203850.18138: done iterating over new_blocks loaded from include file included: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node1 13271 1727203850.18143: extending task lists for all hosts with included blocks 13271 1727203850.19778: done extending task lists 13271 1727203850.19780: done processing included files 13271 1727203850.19781: results queue empty 13271 1727203850.19782: checking for any_errors_fatal 13271 1727203850.19785: done checking for any_errors_fatal 13271 1727203850.19786: checking for max_fail_percentage 13271 1727203850.19787: done checking for max_fail_percentage 13271 1727203850.19788: checking to see if all hosts have failed and the running 
result is not ok 13271 1727203850.19789: done checking to see if all hosts have failed 13271 1727203850.19789: getting the remaining hosts for this loop 13271 1727203850.19790: done getting the remaining hosts for this loop 13271 1727203850.19793: getting the next task for host managed-node1 13271 1727203850.19797: done getting next task for host managed-node1 13271 1727203850.19800: ^ task is: TASK: Check routes and DNS 13271 1727203850.19802: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13271 1727203850.19805: getting variables 13271 1727203850.19806: in VariableManager get_vars() 13271 1727203850.19823: Calling all_inventory to load vars for managed-node1 13271 1727203850.19825: Calling groups_inventory to load vars for managed-node1 13271 1727203850.19827: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203850.19834: Calling all_plugins_play to load vars for managed-node1 13271 1727203850.19836: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203850.19840: Calling groups_plugins_play to load vars for managed-node1 13271 1727203850.21591: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203850.23828: done with get_vars() 13271 1727203850.23860: done getting variables 13271 1727203850.24025: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:50:50 -0400 (0:00:00.114) 0:00:33.883 ***** 13271 1727203850.24057: entering _queue_task() for managed-node1/shell 13271 1727203850.24557: worker is 1 (out of 1 available) 13271 1727203850.24570: exiting _queue_task() for managed-node1/shell 13271 1727203850.24730: done queuing things up, now waiting for results queue to drain 13271 1727203850.24733: waiting for pending results... 
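Before this shell task's module runs, the SSH connection plugin issues two bootstrap round-trips that appear verbatim in the chunks that follow: a home-directory probe (`echo ~`, used to expand `remote_tmp`) and a `umask 77` temp-directory creation using the quoted-backtick-echo pattern. A minimal local reproduction of both, with `/tmp/ansible-demo` as an illustrative stand-in for the remote `~/.ansible/tmp` and a shortened directory name in place of the timestamped one:

```shell
# 1. Home-directory probe, exactly as logged:
/bin/sh -c 'echo ~ && sleep 0'

# 2. Temp-directory creation with umask 77 (yields mode-700 directories);
#    the backtick-echo quoting is copied from the logged command.
rm -rf /tmp/ansible-demo
/bin/sh -c '( umask 77 && mkdir -p "` echo /tmp/ansible-demo `" && mkdir "` echo /tmp/ansible-demo/ansible-tmp-1 `" && echo ansible_tmp="` echo /tmp/ansible-demo/ansible-tmp-1 `" ) && sleep 0'
# prints ansible_tmp=/tmp/ansible-demo/ansible-tmp-1
```

The `echo`-inside-backticks wrapping looks redundant, but it lets the controller send one fixed command string while still getting tilde/variable expansion performed by the remote shell.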
13271 1727203850.24897: running TaskExecutor() for managed-node1/TASK: Check routes and DNS 13271 1727203850.25085: in run() - task 028d2410-947f-2a40-12ba-00000000056d 13271 1727203850.25271: variable 'ansible_search_path' from source: unknown 13271 1727203850.25276: variable 'ansible_search_path' from source: unknown 13271 1727203850.25279: calling self._execute() 13271 1727203850.25351: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203850.25361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203850.25384: variable 'omit' from source: magic vars 13271 1727203850.25765: variable 'ansible_distribution_major_version' from source: facts 13271 1727203850.25783: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203850.25793: variable 'omit' from source: magic vars 13271 1727203850.25848: variable 'omit' from source: magic vars 13271 1727203850.25890: variable 'omit' from source: magic vars 13271 1727203850.25939: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203850.25980: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203850.26005: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203850.26034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203850.26050: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203850.26085: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203850.26093: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203850.26100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203850.26195: 
Set connection var ansible_connection to ssh 13271 1727203850.26206: Set connection var ansible_shell_type to sh 13271 1727203850.26216: Set connection var ansible_timeout to 10 13271 1727203850.26223: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203850.26232: Set connection var ansible_pipelining to False 13271 1727203850.26353: Set connection var ansible_shell_executable to /bin/sh 13271 1727203850.26357: variable 'ansible_shell_executable' from source: unknown 13271 1727203850.26359: variable 'ansible_connection' from source: unknown 13271 1727203850.26362: variable 'ansible_module_compression' from source: unknown 13271 1727203850.26364: variable 'ansible_shell_type' from source: unknown 13271 1727203850.26366: variable 'ansible_shell_executable' from source: unknown 13271 1727203850.26368: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203850.26370: variable 'ansible_pipelining' from source: unknown 13271 1727203850.26372: variable 'ansible_timeout' from source: unknown 13271 1727203850.26374: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203850.26467: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203850.26487: variable 'omit' from source: magic vars 13271 1727203850.26498: starting attempt loop 13271 1727203850.26505: running the handler 13271 1727203850.26519: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203850.26543: 
_low_level_execute_command(): starting 13271 1727203850.26555: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203850.27340: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203850.27381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203850.27463: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203850.27483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203850.27602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203850.29396: stdout chunk (state=3): >>>/root <<< 13271 1727203850.29549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203850.29553: stdout chunk (state=3): >>><<< 13271 1727203850.29555: stderr chunk (state=3): >>><<< 13271 1727203850.29592: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203850.29801: _low_level_execute_command(): starting 13271 1727203850.29806: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203850.2969258-15642-103619495190397 `" && echo ansible-tmp-1727203850.2969258-15642-103619495190397="` echo /root/.ansible/tmp/ansible-tmp-1727203850.2969258-15642-103619495190397 `" ) && sleep 0' 13271 1727203850.30456: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203850.30459: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203850.30473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203850.30492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203850.30503: stderr chunk (state=3): >>>debug2: 
checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203850.30511: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203850.30521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203850.30538: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 13271 1727203850.30547: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.14.47 is address <<< 13271 1727203850.30553: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 13271 1727203850.30577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203850.30580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203850.30672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203850.30686: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203850.30692: stderr chunk (state=3): >>>debug2: match found <<< 13271 1727203850.30695: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203850.30697: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203850.30699: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203850.30722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203850.30834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203850.32932: stdout chunk (state=3): >>>ansible-tmp-1727203850.2969258-15642-103619495190397=/root/.ansible/tmp/ansible-tmp-1727203850.2969258-15642-103619495190397 <<< 13271 1727203850.33107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 13271 1727203850.33110: stdout chunk (state=3): >>><<< 13271 1727203850.33113: stderr chunk (state=3): >>><<< 13271 1727203850.33134: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203850.2969258-15642-103619495190397=/root/.ansible/tmp/ansible-tmp-1727203850.2969258-15642-103619495190397 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203850.33201: variable 'ansible_module_compression' from source: unknown 13271 1727203850.33293: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13271 1727203850.33323: variable 'ansible_facts' from source: unknown 13271 1727203850.33403: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203850.2969258-15642-103619495190397/AnsiballZ_command.py 13271 1727203850.33666: Sending initial data 13271 
1727203850.33669: Sent initial data (156 bytes) 13271 1727203850.34157: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203850.34170: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203850.34182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203850.34284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203850.34344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203850.34407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203850.36311: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: 
Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203850.36388: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13271 1727203850.36452: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmp9lkyt26a /root/.ansible/tmp/ansible-tmp-1727203850.2969258-15642-103619495190397/AnsiballZ_command.py <<< 13271 1727203850.36455: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203850.2969258-15642-103619495190397/AnsiballZ_command.py" <<< 13271 1727203850.36629: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmp9lkyt26a" to remote "/root/.ansible/tmp/ansible-tmp-1727203850.2969258-15642-103619495190397/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203850.2969258-15642-103619495190397/AnsiballZ_command.py" <<< 13271 1727203850.37991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203850.38071: stderr chunk (state=3): >>><<< 13271 1727203850.38091: stdout chunk (state=3): >>><<< 13271 1727203850.38117: done transferring module to remote 13271 1727203850.38177: _low_level_execute_command(): starting 13271 1727203850.38181: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203850.2969258-15642-103619495190397/ /root/.ansible/tmp/ansible-tmp-1727203850.2969258-15642-103619495190397/AnsiballZ_command.py && sleep 0' 13271 1727203850.38888: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203850.38905: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 13271 1727203850.38932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203850.39056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203850.39103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203850.39192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203850.41264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203850.41293: stdout chunk (state=3): >>><<< 13271 1727203850.41295: stderr chunk (state=3): >>><<< 13271 1727203850.41392: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203850.41404: _low_level_execute_command(): starting 13271 1727203850.41407: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203850.2969258-15642-103619495190397/AnsiballZ_command.py && sleep 0' 13271 1727203850.41957: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203850.42073: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203850.42090: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203850.42108: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203850.42234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203850.59964: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:dd:89:9b:e5 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.47/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3193sec preferred_lft 3193sec\n inet6 fe80::8ff:ddff:fe89:9be5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:50:50.587448", "end": "2024-09-24 14:50:50.596618", "delta": "0:00:00.009170", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP 
ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13271 1727203850.61641: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203850.61658: stderr chunk (state=3): >>>Shared connection to 10.31.14.47 closed. <<< 13271 1727203850.61723: stderr chunk (state=3): >>><<< 13271 1727203850.61737: stdout chunk (state=3): >>><<< 13271 1727203850.61783: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:dd:89:9b:e5 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.14.47/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3193sec preferred_lft 3193sec\n inet6 fe80::8ff:ddff:fe89:9be5/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat 
/etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:50:50.587448", "end": "2024-09-24 14:50:50.596618", "delta": "0:00:00.009170", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
13271 1727203850.61847: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203850.2969258-15642-103619495190397/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203850.61881: _low_level_execute_command(): starting 13271 1727203850.61884: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203850.2969258-15642-103619495190397/ > /dev/null 2>&1 && sleep 0' 13271 1727203850.62570: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203850.62640: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203850.62683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203850.62722: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203850.62810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203850.64839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203850.64843: stderr chunk (state=3): >>><<< 13271 1727203850.64846: stdout chunk (state=3): >>><<< 13271 1727203850.64867: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203850.64870: handler run complete 13271 1727203850.65128: Evaluated conditional (False): False 13271 1727203850.65130: attempt loop complete, returning result 13271 1727203850.65132: _execute() done 13271 1727203850.65134: dumping result to json 13271 1727203850.65135: done dumping result, returning 13271 1727203850.65137: done running TaskExecutor() for managed-node1/TASK: Check routes and DNS [028d2410-947f-2a40-12ba-00000000056d] 13271 1727203850.65139: sending task result for task 028d2410-947f-2a40-12ba-00000000056d 13271 1727203850.65213: done sending task result for task 028d2410-947f-2a40-12ba-00000000056d 13271 1727203850.65217: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009170", "end": "2024-09-24 14:50:50.596618", "rc": 0, "start": "2024-09-24 14:50:50.587448" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:dd:89:9b:e5 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.14.47/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3193sec preferred_lft 3193sec inet6 fe80::8ff:ddff:fe89:9be5/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.14.47 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.14.47 metric 100 IP 
-6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 13271 1727203850.65289: no more pending results, returning what we have 13271 1727203850.65297: results queue empty 13271 1727203850.65299: checking for any_errors_fatal 13271 1727203850.65301: done checking for any_errors_fatal 13271 1727203850.65301: checking for max_fail_percentage 13271 1727203850.65303: done checking for max_fail_percentage 13271 1727203850.65304: checking to see if all hosts have failed and the running result is not ok 13271 1727203850.65305: done checking to see if all hosts have failed 13271 1727203850.65306: getting the remaining hosts for this loop 13271 1727203850.65307: done getting the remaining hosts for this loop 13271 1727203850.65311: getting the next task for host managed-node1 13271 1727203850.65317: done getting next task for host managed-node1 13271 1727203850.65319: ^ task is: TASK: Verify DNS and network connectivity 13271 1727203850.65322: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 13271 1727203850.65329: getting variables 13271 1727203850.65336: in VariableManager get_vars() 13271 1727203850.65481: Calling all_inventory to load vars for managed-node1 13271 1727203850.65484: Calling groups_inventory to load vars for managed-node1 13271 1727203850.65487: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203850.65498: Calling all_plugins_play to load vars for managed-node1 13271 1727203850.65501: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203850.65504: Calling groups_plugins_play to load vars for managed-node1 13271 1727203850.66900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203850.68493: done with get_vars() 13271 1727203850.68521: done getting variables 13271 1727203850.68591: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:50:50 -0400 (0:00:00.445) 0:00:34.329 ***** 13271 1727203850.68627: entering _queue_task() for managed-node1/shell 13271 1727203850.69199: worker is 1 (out of 1 available) 13271 1727203850.69209: exiting _queue_task() for managed-node1/shell 13271 1727203850.69219: done queuing things up, now waiting for results queue to drain 13271 1727203850.69221: waiting for pending results... 
13271 1727203850.69343: running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity 13271 1727203850.69783: in run() - task 028d2410-947f-2a40-12ba-00000000056e 13271 1727203850.69788: variable 'ansible_search_path' from source: unknown 13271 1727203850.69792: variable 'ansible_search_path' from source: unknown 13271 1727203850.69795: calling self._execute() 13271 1727203850.69798: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203850.69802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203850.69805: variable 'omit' from source: magic vars 13271 1727203850.70039: variable 'ansible_distribution_major_version' from source: facts 13271 1727203850.70050: Evaluated conditional (ansible_distribution_major_version != '6'): True 13271 1727203850.70204: variable 'ansible_facts' from source: unknown 13271 1727203850.71025: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 13271 1727203850.71029: variable 'omit' from source: magic vars 13271 1727203850.71089: variable 'omit' from source: magic vars 13271 1727203850.71129: variable 'omit' from source: magic vars 13271 1727203850.71177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 13271 1727203850.71216: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 13271 1727203850.71242: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 13271 1727203850.71259: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203850.71277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 13271 1727203850.71313: variable 'inventory_hostname' from source: host vars for 'managed-node1' 13271 1727203850.71316: variable 
'ansible_host' from source: host vars for 'managed-node1' 13271 1727203850.71319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203850.71442: Set connection var ansible_connection to ssh 13271 1727203850.71445: Set connection var ansible_shell_type to sh 13271 1727203850.71448: Set connection var ansible_timeout to 10 13271 1727203850.71454: Set connection var ansible_module_compression to ZIP_DEFLATED 13271 1727203850.71460: Set connection var ansible_pipelining to False 13271 1727203850.71468: Set connection var ansible_shell_executable to /bin/sh 13271 1727203850.71499: variable 'ansible_shell_executable' from source: unknown 13271 1727203850.71502: variable 'ansible_connection' from source: unknown 13271 1727203850.71510: variable 'ansible_module_compression' from source: unknown 13271 1727203850.71513: variable 'ansible_shell_type' from source: unknown 13271 1727203850.71515: variable 'ansible_shell_executable' from source: unknown 13271 1727203850.71517: variable 'ansible_host' from source: host vars for 'managed-node1' 13271 1727203850.71522: variable 'ansible_pipelining' from source: unknown 13271 1727203850.71524: variable 'ansible_timeout' from source: unknown 13271 1727203850.71529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 13271 1727203850.71685: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203850.71696: variable 'omit' from source: magic vars 13271 1727203850.71761: starting attempt loop 13271 1727203850.71764: running the handler 13271 1727203850.71767: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 13271 1727203850.71770: _low_level_execute_command(): starting 13271 1727203850.71772: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 13271 1727203850.72537: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203850.72600: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203850.72663: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203850.72721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203850.72809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203850.74883: stdout chunk (state=3): >>>/root <<< 13271 1727203850.74886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203850.74895: stdout chunk (state=3): >>><<< 13271 1727203850.74910: stderr chunk (state=3): 
>>><<< 13271 1727203850.75031: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203850.75035: _low_level_execute_command(): starting 13271 1727203850.75038: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727203850.7493947-15719-272971641466093 `" && echo ansible-tmp-1727203850.7493947-15719-272971641466093="` echo /root/.ansible/tmp/ansible-tmp-1727203850.7493947-15719-272971641466093 `" ) && sleep 0' 13271 1727203850.75590: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203850.75612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203850.75636: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 13271 1727203850.75736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203850.75761: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203850.75793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203850.75819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203850.75927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203850.78082: stdout chunk (state=3): >>>ansible-tmp-1727203850.7493947-15719-272971641466093=/root/.ansible/tmp/ansible-tmp-1727203850.7493947-15719-272971641466093 <<< 13271 1727203850.78259: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203850.78263: stdout chunk (state=3): >>><<< 13271 1727203850.78265: stderr chunk (state=3): >>><<< 13271 1727203850.78392: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727203850.7493947-15719-272971641466093=/root/.ansible/tmp/ansible-tmp-1727203850.7493947-15719-272971641466093 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203850.78396: variable 'ansible_module_compression' from source: unknown 13271 1727203850.78418: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-13271t0gorrax/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 13271 1727203850.78474: variable 'ansible_facts' from source: unknown 13271 1727203850.78581: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727203850.7493947-15719-272971641466093/AnsiballZ_command.py 13271 1727203850.78865: Sending initial data 13271 1727203850.78867: Sent initial data (156 bytes) 13271 1727203850.79571: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203850.79674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203850.79693: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203850.79705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203850.79813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203850.81577: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 13271 1727203850.81594: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 13271 1727203850.81610: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 13271 1727203850.81628: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 13271 1727203850.81663: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 13271 1727203850.81753: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 13271 1727203850.81840: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-13271t0gorrax/tmp51mm3xf7 /root/.ansible/tmp/ansible-tmp-1727203850.7493947-15719-272971641466093/AnsiballZ_command.py <<< 13271 1727203850.81843: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727203850.7493947-15719-272971641466093/AnsiballZ_command.py" <<< 13271 1727203850.81915: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-13271t0gorrax/tmp51mm3xf7" to remote "/root/.ansible/tmp/ansible-tmp-1727203850.7493947-15719-272971641466093/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727203850.7493947-15719-272971641466093/AnsiballZ_command.py" <<< 13271 1727203850.82843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203850.82870: stderr chunk (state=3): >>><<< 13271 1727203850.82947: stdout chunk (state=3): >>><<< 13271 1727203850.82950: done transferring module to remote 13271 1727203850.82954: _low_level_execute_command(): starting 13271 1727203850.82956: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727203850.7493947-15719-272971641466093/ /root/.ansible/tmp/ansible-tmp-1727203850.7493947-15719-272971641466093/AnsiballZ_command.py && sleep 0' 13271 1727203850.83866: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203850.83869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found <<< 13271 
1727203850.83872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203850.83880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203850.83883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203850.83956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203850.84032: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203850.86070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203850.86074: stdout chunk (state=3): >>><<< 13271 1727203850.86079: stderr chunk (state=3): >>><<< 13271 1727203850.86098: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203850.86107: _low_level_execute_command(): starting 13271 1727203850.86188: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727203850.7493947-15719-272971641466093/AnsiballZ_command.py && sleep 0' 13271 1727203850.86732: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 13271 1727203850.86745: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 13271 1727203850.86755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 13271 1727203850.86771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 13271 1727203850.86787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 <<< 13271 1727203850.86806: stderr chunk (state=3): >>>debug2: match not found <<< 13271 1727203850.86894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203850.86920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203850.86936: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203850.86958: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203850.87082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203851.50608: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 
0\r100 305 100 305 0 0 1365 0 --:--:-- --:--:-- --:--:-- 1361\r100 305 100 305 0 0 1364 0 --:--:-- --:--:-- --:--:-- 1361\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1320 0 --:--:-- --:--:-- --:--:-- 1322", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:50:51.033808", "end": "2024-09-24 14:50:51.503276", "delta": "0:00:00.469468", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 13271 1727203851.52268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 
<<< 13271 1727203851.52292: stdout chunk (state=3): >>><<< 13271 1727203851.52311: stderr chunk (state=3): >>><<< 13271 1727203851.52401: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1365 0 --:--:-- --:--:-- --:--:-- 1361\r100 305 100 305 0 0 1364 0 --:--:-- --:--:-- --:--:-- 1361\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1320 0 --:--:-- --:--:-- --:--:-- 1322", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND 
CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:50:51.033808", "end": "2024-09-24 14:50:51.503276", "delta": "0:00:00.469468", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.14.47 closed. 13271 1727203851.52410: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727203850.7493947-15719-272971641466093/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 13271 1727203851.52422: _low_level_execute_command(): starting 13271 1727203851.52434: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727203850.7493947-15719-272971641466093/ > /dev/null 2>&1 && sleep 0' 13271 1727203851.53179: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 13271 1727203851.53207: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' <<< 13271 1727203851.53225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 13271 1727203851.53246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 13271 1727203851.53363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 13271 1727203851.55737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 13271 1727203851.55741: stdout chunk (state=3): >>><<< 13271 1727203851.55743: stderr chunk (state=3): >>><<< 13271 1727203851.55745: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.14.47 originally 10.31.14.47 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.14.47 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.14.47 originally 10.31.14.47 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/a0f5415566' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 13271 1727203851.55747: handler run complete 13271 1727203851.55749: Evaluated conditional (False): False 13271 1727203851.55751: attempt loop complete, returning result 13271 1727203851.55753: _execute() done 13271 1727203851.55755: dumping result to json 13271 1727203851.55757: done dumping result, returning 13271 1727203851.55758: done running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity [028d2410-947f-2a40-12ba-00000000056e] 13271 1727203851.55763: sending task result for task 028d2410-947f-2a40-12ba-00000000056e
ok: [managed-node1] => {
    "changed": false,
    "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n",
    "delta": "0:00:00.469468",
    "end": "2024-09-24 14:50:51.503276",
    "rc": 0,
    "start": "2024-09-24 14:50:51.033808"
}

STDOUT:

CHECK DNS AND CONNECTIVITY
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org
2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org
2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org

STDERR:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   305  100   305    0     0   1365      0 --:--:-- --:--:-- --:--:--  1361
100   305  100   305    0     0   1364      0 --:--:-- --:--:-- --:--:--  1361
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100   291  100   291    0     0   1320      0 --:--:-- --:--:-- --:--:--  1322

13271 1727203851.56035: no more pending results, returning what we have 13271
1727203851.56038: results queue empty 13271 1727203851.56039: checking for any_errors_fatal 13271 1727203851.56048: done checking for any_errors_fatal 13271 1727203851.56049: checking for max_fail_percentage 13271 1727203851.56051: done checking for max_fail_percentage 13271 1727203851.56052: checking to see if all hosts have failed and the running result is not ok 13271 1727203851.56053: done checking to see if all hosts have failed 13271 1727203851.56054: getting the remaining hosts for this loop 13271 1727203851.56055: done getting the remaining hosts for this loop 13271 1727203851.56059: getting the next task for host managed-node1 13271 1727203851.56080: done getting next task for host managed-node1 13271 1727203851.56084: ^ task is: TASK: meta (flush_handlers) 13271 1727203851.56087: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203851.56096: getting variables 13271 1727203851.56098: in VariableManager get_vars() 13271 1727203851.56142: Calling all_inventory to load vars for managed-node1 13271 1727203851.56145: Calling groups_inventory to load vars for managed-node1 13271 1727203851.56148: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203851.56159: Calling all_plugins_play to load vars for managed-node1 13271 1727203851.56167: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203851.56171: Calling groups_plugins_play to load vars for managed-node1 13271 1727203851.57126: done sending task result for task 028d2410-947f-2a40-12ba-00000000056e 13271 1727203851.57130: WORKER PROCESS EXITING 13271 1727203851.59023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203851.60680: done with get_vars() 13271 1727203851.60707: done getting variables 13271 1727203851.60790: in VariableManager get_vars() 13271 1727203851.60807: Calling all_inventory to load vars for managed-node1 13271 1727203851.60809: Calling groups_inventory to load vars for managed-node1 13271 1727203851.60812: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203851.60817: Calling all_plugins_play to load vars for managed-node1 13271 1727203851.60819: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203851.60822: Calling groups_plugins_play to load vars for managed-node1 13271 1727203851.62005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203851.63806: done with get_vars() 13271 1727203851.63840: done queuing things up, now waiting for results queue to drain 13271 1727203851.63842: results queue empty 13271 1727203851.63848: checking for any_errors_fatal 13271 1727203851.63854: done checking for any_errors_fatal 13271 1727203851.63855: checking for max_fail_percentage 13271 
1727203851.63856: done checking for max_fail_percentage 13271 1727203851.63857: checking to see if all hosts have failed and the running result is not ok 13271 1727203851.63857: done checking to see if all hosts have failed 13271 1727203851.63858: getting the remaining hosts for this loop 13271 1727203851.63859: done getting the remaining hosts for this loop 13271 1727203851.63865: getting the next task for host managed-node1 13271 1727203851.63869: done getting next task for host managed-node1 13271 1727203851.63870: ^ task is: TASK: meta (flush_handlers) 13271 1727203851.63872: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 13271 1727203851.63877: getting variables 13271 1727203851.63878: in VariableManager get_vars() 13271 1727203851.63894: Calling all_inventory to load vars for managed-node1 13271 1727203851.63896: Calling groups_inventory to load vars for managed-node1 13271 1727203851.63898: Calling all_plugins_inventory to load vars for managed-node1 13271 1727203851.63904: Calling all_plugins_play to load vars for managed-node1 13271 1727203851.63907: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203851.63909: Calling groups_plugins_play to load vars for managed-node1 13271 1727203851.65217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203851.66782: done with get_vars() 13271 1727203851.66807: done getting variables 13271 1727203851.66863: in VariableManager get_vars() 13271 1727203851.66886: Calling all_inventory to load vars for managed-node1 13271 1727203851.66888: Calling groups_inventory to load vars for managed-node1 13271 1727203851.66891: Calling all_plugins_inventory to load vars for managed-node1 13271 
1727203851.66896: Calling all_plugins_play to load vars for managed-node1 13271 1727203851.66898: Calling groups_plugins_inventory to load vars for managed-node1 13271 1727203851.66901: Calling groups_plugins_play to load vars for managed-node1 13271 1727203851.68125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 13271 1727203851.70319: done with get_vars() 13271 1727203851.70348: done queuing things up, now waiting for results queue to drain 13271 1727203851.70350: results queue empty 13271 1727203851.70351: checking for any_errors_fatal 13271 1727203851.70352: done checking for any_errors_fatal 13271 1727203851.70353: checking for max_fail_percentage 13271 1727203851.70354: done checking for max_fail_percentage 13271 1727203851.70355: checking to see if all hosts have failed and the running result is not ok 13271 1727203851.70355: done checking to see if all hosts have failed 13271 1727203851.70356: getting the remaining hosts for this loop 13271 1727203851.70357: done getting the remaining hosts for this loop 13271 1727203851.70363: getting the next task for host managed-node1 13271 1727203851.70367: done getting next task for host managed-node1 13271 1727203851.70368: ^ task is: None 13271 1727203851.70369: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 13271 1727203851.70370: done queuing things up, now waiting for results queue to drain 13271 1727203851.70371: results queue empty 13271 1727203851.70372: checking for any_errors_fatal 13271 1727203851.70373: done checking for any_errors_fatal 13271 1727203851.70373: checking for max_fail_percentage 13271 1727203851.70374: done checking for max_fail_percentage 13271 1727203851.70380: checking to see if all hosts have failed and the running result is not ok 13271 1727203851.70381: done checking to see if all hosts have failed 13271 1727203851.70385: getting the next task for host managed-node1 13271 1727203851.70389: done getting next task for host managed-node1 13271 1727203851.70390: ^ task is: None 13271 1727203851.70391: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False

PLAY RECAP *********************************************************************
managed-node1              : ok=76   changed=3    unreachable=0    failed=0    skipped=60   rescued=0    ignored=0

Tuesday 24 September 2024  14:50:51 -0400 (0:00:01.018)       0:00:35.348 *****
===============================================================================
Install dnsmasq --------------------------------------------------------- 2.35s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 2.25s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.12s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 2.00s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Gathering Facts --------------------------------------------------------- 1.41s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.11s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.05s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.03s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Verify DNS and network connectivity ------------------------------------- 1.02s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Gathering Facts --------------------------------------------------------- 0.98s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.98s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.87s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Install pgrep, sysctl --------------------------------------------------- 0.78s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.76s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gather the minimum subset of ansible_facts required by the network role test --- 0.67s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.59s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.57s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Get NM profile info ----------------------------------------------------- 0.57s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.47s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Stop dnsmasq/radvd services --------------------------------------------- 0.47s
/tmp/collections-bGV/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23

13271 1727203851.70529: RUNNING CLEANUP
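The "Verify DNS and network connectivity" task in this run executes a small shell script, visible in the task result's `cmd` field. A minimal standalone sketch of that check is below; note the assumptions: the hostnames are moved from hard-coded values to command-line arguments, the wrapper function name `check_hosts` is my own (it does not appear in the playbook), and `exit 1` becomes `return 1` so the function can be sourced and reused.

```shell
# Standalone sketch of the connectivity check embedded in the task result.
# The original runs under `set -euo pipefail` against the fixed hosts
# mirrors.fedoraproject.org and mirrors.centos.org; here the hosts come
# from the arguments, and `check_hosts` is a hypothetical wrapper name.
check_hosts() {
  echo CHECK DNS AND CONNECTIVITY
  for host in "$@"; do
    # getent resolves through the system NSS stack (DNS, /etc/hosts, ...),
    # which is what produced the per-host address lines in STDOUT above.
    if ! getent hosts "$host"; then
      echo "FAILED to lookup host $host"
      return 1
    fi
    # curl discards the response body; a non-zero exit means the host was
    # not reachable over HTTPS (the curl progress meter is the STDERR above).
    if ! curl -s -o /dev/null "https://$host"; then
      echo "FAILED to contact host $host"
      return 1
    fi
  done
}
```

With no arguments the function only prints the banner and succeeds; passing one or more hostnames reproduces the lookup-then-fetch behaviour the task logged.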